• Breaking News

    Monday, February 3, 2020

    How much have Computer Science Programs changed over the past 20 and 30 years? Computer Science

    How much have Computer Science Programs changed over the past 20 and 30 years?

    Posted: 02 Feb 2020 06:54 PM PST

    So my dad got his BS in Computer Science from Stanford in 1991, and it got me thinking. How much have Computer Science programs changed over the past few decades? What's different today compared to back then? What would a computer scientist know today that a computer scientist back then wouldn't have known, and vice versa?

    submitted by /u/0_0_Mike
    [link] [comments]

    “Things I Believe About Software Engineering”

    Posted: 03 Feb 2020 02:47 AM PST

    Which subject to choose for semester?

    Posted: 03 Feb 2020 05:10 AM PST

    So I am a student from India in an undergrad CS course, currently in my 6th semester. We are offered two subjects to choose from as electives: "Operations Research" and "Data Mining and Data Warehousing". But I am not aware of how these subjects are used in industry, so I am unable to decide which to choose. Can anyone please shed some light on what purpose these subjects serve in industry and where they are used? I have the syllabus content with me as a PDF.

    submitted by /u/enfirius
    [link] [comments]

    [XPOST] - CRC and Hamming codes books recommendations

    Posted: 03 Feb 2020 04:13 AM PST

    Are there any examples of AI being implemented in the Justice System?

    Posted: 02 Feb 2020 05:20 PM PST

    I was interested in how we've been implementing AI in society. I know of the obvious stuff such as tech and medicine. Would love to hear about implementation of AI in the Justice System.

    submitted by /u/ArduinoMasterRace
    [link] [comments]

    Log4 Machine Language

    Posted: 02 Feb 2020 04:39 PM PST

    Logarithm Base 4 Machine Language

    Alexander Tung-cuu Le

    San Francisco State University

    Introduction

    Machine language today is built on a binary system of '0's and '1's, which computers translate into higher-level representations – creating more meaningful expressions like the string representation of the word "true". The integrated circuitry of a computer's motherboard includes logic gates that regulate the flow of electrons. If you've heard of bits and bytes when talking about computer memory, you may have heard a byte described as 8 bits – 8 digit places, each holding either a '0' or a '1'. Take, for example, the letter (char) 't', represented as this series of binary digits: '01110100'. This electron-signal transcoding system, based on 8 bits, can represent 256 different patterns, of which the ASCII character encoding uses only 128. Let's take one of the byte patterns unused by ASCII and, consistent with Figures 1a and 1b, use it to represent a base-4 polygon – a square:

    https://preview.redd.it/9fsf7c2wvle41.jpg?width=566&format=pjpg&auto=webp&s=18df3b118913df984bb213052630e63340afc0da
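    As a quick sketch of the arithmetic above (the 8-bit pattern for 't', the 2^8 = 256 possible byte patterns, and the 128 of them that ASCII assigns), here is a minimal Python example; the "square" character and its pattern '10101011' are hypothetical additions from this post, not part of ASCII:

        # 8-bit binary pattern for the ASCII character 't'
        t_bits = format(ord('t'), '08b')
        print(t_bits)        # -> '01110100'

        # an 8-bit byte distinguishes 2 ** 8 = 256 patterns; ASCII assigns only 128 of them
        print(2 ** 8)        # -> 256

        # hypothetical extension: an unused pattern reserved for the "square" character
        SQUARE_BITS = '10101011'           # assumed encoding from this post, not real ASCII
        assert int(SQUARE_BITS, 2) > 127   # it lies outside the ASCII range (0-127)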

    Imagine we had another key on our keyboard to input this new addition to the ASCII encoding system: the square character, encoded in memory as the binary representation '10101011'. In binary, each bit, or electron, is read in a single time interval – say, 1 second. In practice, given the physical limitations of our standard conventions of time, that single time unit of binary computing is measured at nanoscales, so take the unit of measurement to be 1 nanosecond. The quaternary machine language requires that two bits be read or encoded per time unit; the proposal is therefore to use a quaternary machine language measured at 1.1 nanoseconds and read two bits of information per interval (requiring only .1 additional units of time to determine whether the second bit of the pair is an 'on' or 'off' signal). The advantage is clear: in quaternary we read two bits per 1.1 nanoseconds – a 2:1.1 ratio – as opposed to binary's 1:1. This means that with a log4 machine language we can receive about 1.8 bits of information per normal time interval (in binary, the normal time interval is 1 nanosecond). See the following diagrams for the concept of electron-wave counting and computational abstraction in binary (Figure 1a) and in quaternary (Figure 1b).
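    The claimed throughput gain is just the ratio arithmetic above; a small Python check (the 1.1 nanosecond interval is this post's assumption):

        # bits per nanosecond in binary vs. the proposed quaternary scheme
        binary_rate = 1 / 1.0        # 1 bit per 1 ns interval
        quaternary_rate = 2 / 1.1    # 2 bits per 1.1 ns interval (assumed above)

        print(binary_rate)                 # -> 1.0
        print(round(quaternary_rate, 2))   # -> 1.82, roughly 1.8 bits per normal interval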

    Figure 1a) Electron Sine-wave Signaling to Binary Machine Language

    https://preview.redd.it/nykckzggsle41.jpg?width=1307&format=pjpg&auto=webp&s=47db12ea45ced15d01e3420b32937390c8c0a915

    In Figure 1a, the x-axis represents time and the y-axis represents the amplitude of the signal. Each complete revolution of the sine wave represents a complete electron and is read as a '1', or 'on', signal in machine language. The absence of electron-wave activity over one time interval signals a '0', or 'off', signal. The red rectangular enclosures in Figure 1a represent the 1 nanosecond time intervals measured for computational purposes; above each one is its machine language encoding in memory, either '0' or '1'.

    Log4 Machine Language

    The log4 machine language reinterprets the binary system into one best understood through the following assembly language representation, using 12 symbols with two sets of meanings: 1) '+' and '-', representing a base 'on' or base 'off' signal, respectively; and 2) '0' through '9', representing ten positions in a matrix where we measure the physical location of a potential trailing 'on' or 'off' signal within a .1 nanosecond interval. If the electron wave is not present within that .1 nanosecond interval, we assume the trailing signal is 'off' and reset the nano-clock. The table in Figure 2 below summarizes the semantic meaning and pattern representation of each symbol.

    Figure 2) Log4 Assembly Language

    Log4 Symbol   Type    Binary   Nanoseconds   Meaning   Signal Multiplier (x times)
    +             Base    1        1.1           on        -
    -             Base    0        0.1           off       -
    0             Multi   01       0.009         off-on    1x
    1             Multi   10       0.018         on-off    1x
    2             Multi   01       0.027         off-on    2x
    3             Multi   10       0.036         on-off    2x
    4             Multi   01       0.045         off-on    3x
    5             Multi   10       0.054         on-off    3x
    6             Multi   01       0.063         off-on    4x
    7             Multi   10       0.072         on-off    4x
    8             Multi   01       0.081         off-on    6x
    9             Multi   10       0.090         on-off    6x
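    Read as data, Figure 2 might look like the following Python sketch (a hypothetical encoding of the table; the base symbols carry no multiplier, matching the ten-entry multiplier column):

        # Figure 2 as a lookup table: log4 symbol -> (binary pair, nanoseconds, signal multiplier)
        LOG4_SYMBOLS = {
            '+': ('1', 1.1, None),  '-': ('0', 0.1, None),
            '0': ('01', 0.009, 1),  '1': ('10', 0.018, 1),
            '2': ('01', 0.027, 2),  '3': ('10', 0.036, 2),
            '4': ('01', 0.045, 3),  '5': ('10', 0.054, 3),
            '6': ('01', 0.063, 4),  '7': ('10', 0.072, 4),
            '8': ('01', 0.081, 6),  '9': ('10', 0.090, 6),
        }

        bits, ns, mult = LOG4_SYMBOLS['2']
        print(bits, ns, mult)   # -> 01 0.027 2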

    For the electron-wave counting and computational abstraction of the square character ('10101011') imagined earlier, now in quaternary, see Figure 1b.

    Figure 1b) Electron Sine-wave Signaling to Quaternary Machine Language

    https://preview.redd.it/ok1pddzisle41.jpg?width=1294&format=pjpg&auto=webp&s=14c88d2fdf9a1e321436b6e10383638ec113057e

    We stretch the wavelength of the electron to 1.1 nanoseconds – no stretch of the imagination given the discovery of dark energy – using the De Broglie wavelength equation, w = h/(mv) (where w is the wavelength, h is Planck's constant, m is the mass of the particle, and v is its velocity). We can then transcribe a '+' symbol – a base 'on' signal – which will have a wavelength equal to w = h/(m(v-1)). Following Louis De Broglie, we can stretch the wavelength of a particle by decreasing its velocity. What we'll be capturing is a photo signature of where and when the electron passes through a flood gate within the .1 nanosecond interval outlined by the blue rectangles. By designing a system capable of reacting across multiple channels to a photosensitive reaction over a .1 nanosecond interval, one can interpret the exact location of the electron during that .1 interval and create delays when sending a trailing 'on' signal, representing the ten signal multipliers from Figure 2. Once a '01' or '10' pattern is detected, the position of the trailing electron determines what the next base signal will be multiplied by. See Figure 3 for the electron mapping of the ten signal multipliers.
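    Here is a minimal numeric sketch of the De Broglie relation used above, solving for the electron velocity that would give the 0.329771704 meter wavelength derived later in the post (the physical constants are standard; the target wavelength is the post's 1.1 nanosecond interval times the speed of light):

        # De Broglie relation: wavelength w = h / (m * v)
        H = 6.626e-34         # Planck's constant, J*s
        M_E = 9.109e-31       # electron mass, kg
        C = 299_792_458       # speed of light, m/s

        target_wavelength = C * 1.1e-9      # ~0.3298 m, the "stretched" 1.1 ns wave

        # velocity an electron would need for that wavelength (solve w = h/(m*v) for v)
        v = H / (M_E * target_wavelength)
        print(v)                            # -> ~2.2e-3 m/s
        print(H / (M_E * v))                # -> ~0.3298 m, recovering the target wavelength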

    Figure 3) Electron Particle Mapping for Signal Multipliers

    https://preview.redd.it/tydiqu1ksle41.jpg?width=1468&format=pjpg&auto=webp&s=a641dd129923d62adb8c8f67624ad24a40adcefd

    By creating a photosensitive computational operation over the .1 nanosecond matrix to track the actual location of the electron, we can create many more symbolic meanings for machine language to incorporate into assembly. The blue rectangles delineate the frame in which the electron particle (black circle outline) is captured for a signal-multiplier signal; the numbers give the corresponding signal-multiplier symbol in the log4 machine language. Each signal multiplier sets the current for the next 1.1 nanosecond electron wave, drawn from a locally sourced electron pool, so that electron-to-computational transcription happens linearly and simultaneously. For example, a log4 transcription of '2+' locally sources two 'on'-signal electrons of 1.1 nanosecond wavelength, equating to '0111' in binary; transcribed as '6+', the binary representation would be '01111111'. When the system is designed with two electron sources, it incurs a lag of an additional 1 nanosecond for the complete termination of the trailing signal, whose energy is repurposed to activate the signal-multiplying channel through which the next, multiplied base signal will flow locally. Whether the trailing base signal is replicated as 'on' or 'off' is determined by the signal that follows the signal-multiplier signal in the next 1 nanosecond interval. The signal-multiplier signal should trigger the computer's logic gates to re-route to the locally sourced electron pool and circuitry, assume the trailing base 'on' signal's integrity, and flood the logic gates with 1.1 nanosecond-wavelength electrons multiplied by the factor captured in Figure 3 – so the actual ratio of bits per nanosecond for '01' and '10' transcriptions is 2:2, or 1 bit per nanosecond. After one entire byte sequence is transcribed into binary machine language from the log4 version – ten bits transcoded – the signal multiplier resets, by default, to multiplying the next base signal by one.
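    A rough Python sketch of the expansion rule described above, using the Figure 2 symbol table (this is one reading of the scheme, not a reference implementation; note that under Figure 2 the 6x multiplier belongs to symbols '8' and '9'):

        # expand a log4 string into binary: a multiplier symbol contributes its own bit pair
        # and multiplies the next base signal; '+' is a base '1', '-' a base '0'.
        MULTIPLIER = {'0': 1, '1': 1, '2': 2, '3': 2, '4': 3, '5': 3, '6': 4, '7': 4, '8': 6, '9': 6}
        PAIR = {'0': '01', '1': '10', '2': '01', '3': '10', '4': '01',
                '5': '10', '6': '01', '7': '10', '8': '01', '9': '10'}

        def log4_to_binary(symbols):
            out, factor = [], 1
            for s in symbols:
                if s in MULTIPLIER:          # multiplier symbol: emit its pair, arm the factor
                    out.append(PAIR[s])
                    factor = MULTIPLIER[s]
                else:                        # base signal, repeated by the armed factor
                    out.append(('1' if s == '+' else '0') * factor)
                    factor = 1               # reset (the post resets per transcribed byte)
            return ''.join(out)

        print(log4_to_binary('2+'))    # -> '0111', matching the example above
        print(log4_to_binary('8+'))    # -> '01111111' ('8' carries the 6x multiplier in Figure 2)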

    This sort of "two-bit reading" machine requires that we capture and react to the location of the electron at two points in time: 1 nanosecond and 1.1 nanoseconds. By observing the electron's position at these two points, we can determine whether the signal is 'on' or 'off', and how 'on' or 'off' the next signal will be – via the signal multiplier system. At the 1 nanosecond mark, we only need to determine whether the electron is absent (since an 'on' signal can be detected during the .1 duration leading up to the 1.1 mark). During that .1 duration, we have to track and react to the velocity of the electron to determine what factor to multiply our locally sourced base signaling system by.

    Creating the Matrix

    A nanosecond is one-billionth of a second. The speed of light is 299,792,458 meters per second. The distance a signal travels in one nanosecond, using the speed of light as a constant, is therefore 0.299792458 meters. By calibrating the entire log4 system to 1.1 nanoseconds, we would need to create wavelengths 0.329771704 meters in length.
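    Both lengths are just the speed of light multiplied by the interval; as a quick check in Python:

        C = 299_792_458       # speed of light, m/s

        print(C * 1e-9)       # -> 0.299792458 m travelled in 1 ns
        print(C * 1.1e-9)     # -> 0.3297717038 m, the wavelength for a 1.1 ns interval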

    To create the matrix that captures and performs computational operations based upon the position of the electron within the .1 nanosecond interval, we'll need a measurement system with .009 nanosecond intervals – and accuracy to match – to check for the initial detection of the electron within those intervals. The point of initial detection (the Nanoseconds column in Figure 2) indicates which signal multiplier to apply to the next base signal and thereby triggers the entire system to locally source an electron pool and circuitry for the production of base signals.

    Base Signals and Local Base Signals

    As mentioned before, base signals are measured at 1.1 nanoseconds. A complete revolution (2π) within 1.1 nanoseconds is one base 'on' signal; an 'off' signal is the absence of a signal at the 1 nanosecond mark.

    Local base signals are part of a separate sub-system within the overall log4 system. They are triggered by the end of a trailing 'on' signal from the signal-multiplier base signal. Through this parallel system – parallel to the core assembly-processing system, with much of its operation occurring on a local end terminal – an electron pool is drawn upon for continued base signaling, and the multiplying factor for the next, locally generated base signals is calibrated by the signal multiplier. A switch-mode mechanism should tell the system to switch to the logic circuits for local signal generation, using the trailing base signal captured at the end of the .1 nanosecond interval of the full 1.1 nanosecond wave-span. The core processing system should also have its internal logic clock reset to compensate for the extra .1 nanosecond that no longer needs to be accounted for once a signal-multiplying signal has been received; this keeps the core system's clocks synchronized with the sub-system's clocks and preserves data transmission integrity.

    Drawbacks to the Log4 Machine Language

    The log4 machine language has drawbacks depending upon which byte pattern is being transcribed. For example, a '00000000' binary message would take 8.8 nanoseconds (compared to 8 nanoseconds in binary computation). Any repeating series of bit digits following a signal-multiplier signal that doesn't divide evenly by 2, 3, 4, or 6 will also see a decrease in computational speed (for example, '01111101', or '0++++0' in log4, would require 8.4 nanoseconds; and '01110101', or '0++00', would require 8.2 nanoseconds).
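    The timings in these examples appear to follow from charging 1.1 nanoseconds per base symbol ('+' or '-') and 2 nanoseconds per multiplier symbol (its two bits at the 1-bit-per-nanosecond ratio noted earlier); a hypothetical back-of-the-envelope check in Python under that assumption:

        # assumed cost model (inferred from the examples, not stated as a rule in the post):
        # base symbols '+'/'-' take 1.1 ns, multiplier symbols '0'-'9' take 2 ns.
        def log4_duration_ns(symbols):
            return round(sum(2.0 if s.isdigit() else 1.1 for s in symbols), 1)

        print(log4_duration_ns('--------'))  # -> 8.8 ns for binary '00000000' (vs 8 ns in binary)
        print(log4_duration_ns('0++++0'))    # -> 8.4 ns for binary '01111101'
        print(log4_duration_ns('0++00'))     # -> 8.2 ns for binary '01110101'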

    These drawbacks occur for rarer byte patterns, as repeating bit series with run lengths that are factors of 2, 3, 4, or 6 are more common in byte structuring than byte patterns of base 1 – '01' and '10'.

    By recognizing when these '01' and '10' patterns occur in byte sequences and assigning them more meaning – thereby developing electronic machinery that can flood a locally electron-sourced sub-system through the appropriate amperage channels, dependent upon the location of the electron in the .1 nanosecond matrix – we can trigger the release of multiple locally sourced electrons (varying from 1, 2, 3, 4, or 6 electrons per unit of time) on the next single incoming base signal. This saves telecommunication bandwidth in the number of bits transmitted, since the log4 machine language can be interpreted by a machine using 12 unique symbols.

    submitted by /u/fullstackalexle
    [link] [comments]

    How to learn to read CS Scientific Papers

    Posted: 02 Feb 2020 07:45 AM PST

    I imagine this might sound like a pretty dumb thing. I am a second-year Computer Science undergraduate and I thought that it is about time to learn how to read scientific papers (my plan for this year is to read at least one paper every month or two). However, I am not sure where to begin. I don't have any prior similar experience and I wouldn't call myself really confident in maths (although I can understand concepts with a lot of Google searching, and I don't mind that at all). The main goal would be learning and getting more immersed in Computer Science.

    So, the main thing I would like to ask is: do any of you know not-so-difficult papers that I could read and learn from in the beginning? (The date of the paper is not important – it could be a classic from the 1950s; I would pretty much enjoy reading quality "vintage" work. Topics also do not matter too much – formal mathematics, proofs, AI, etc.) Or perhaps you know a list of such papers that I could use? Or a guide on how to read such articles?

    Thank you very much for your answers!

    submitted by /u/Lithuanian1dude
    [link] [comments]
