• Breaking News

    Tuesday, October 23, 2018

    What are some CompSci books (or books close to the field) you can read to relax? Something interesting that I can enjoy without taking notes or re-reading a definition over and over again.

    Posted: 22 Oct 2018 05:23 AM PDT

    Hey!

    I would sometimes love to open a book about CS, read, and relax, but most books in the field are so dense that it hurts my head to read them after a long day of math and programming. So I am not talking about 'Introduction to Programming 101', 'Complexity Theory 101', etc. I know those by heart. Instead, something creative? Crazy? Philosophical? Something I can read without being scared that missing one definition will leave me completely lost on the next page.

    What are your favorites that one can read without too much focus? I loved the simpler books (Secrets and Lies, etc.) by Bruce Schneier, for example. Cool topic (IntSec) but not too much hard science. Goedel, Escher, Bach was pretty fun as well. I can enjoy those with a glass of whiskey in my hand :)

    submitted by /u/RadiantRectangle

    Simple invariants distinguishing at least some strongly regular graphs and their vertices

    Posted: 23 Oct 2018 03:13 AM PDT

    Bring back realism in technical interviews

    Posted: 23 Oct 2018 03:10 AM PDT

    What object can represent a programmer?

    Posted: 22 Oct 2018 08:05 PM PDT

    Hello everyone,

    I'm getting my BS in computer science, but I still have to take general education classes in between the classes that go toward my major. One of those classes is communications, and I got my first presentation assignment today.

    The assignment was to bring a physical object to represent your major and who you are. My thing is coding and creating programs, but so far I cannot think of what to bring... I thought about bringing my USB flash drives with all my projects, but that doesn't really seem right.

    I guess my question is, if you guys had to pick a physical object to represent you as a programmer, what would it be and why?

    submitted by /u/jvxn21

    Need a Subject Matter Expert for a Technical report I am doing about Artificial Intelligence

    Posted: 22 Oct 2018 06:02 PM PDT

    So I am working on a technical report that is due tomorrow and realized that I am missing one of the two interviews I was to conduct for the paper.

    I would be very grateful if someone qualified could answer a couple questions about Artificial Intelligence.

    Questions:

    1. How did you learn what you know about artificial intelligence?

    2a. Have you implemented a form of artificial intelligence before? (Machine learning, GOFAI, etc.)

    2b. If yes, could you briefly elaborate?

    3. Do you currently use it in your career or in research?

    4. Between Symbolic Artificial Intelligence, Statistical Machine Learning, and Biological Machine Learning, which of the three do you think is the most popular in research?

    submitted by /u/Crumbie028

    GPGPU/FPGA Programming question: Looking to accelerate data processing.

    Posted: 22 Oct 2018 03:56 PM PDT

    Good afternoon! Hopefully, I came to the right place!

    Recently, I have taken an interest in performing combinatorial analysis for the game of 21 (blackjack) and attempted to use my AMD APU to thread the program across the 4 cores on the chip. Generating the proper strategy and expectations has proven to be a success, and it takes very few seconds (around 12) to compute the optimal strategy for any given rule(s) and deck composition. However, when it comes to computing other deck compositions, it would take 12.63 years to compute all deck states...for a single deck. So, another approach would be to simulate a select limit of random deck subsets via a Monte Carlo method. This will work! However, the 12 seconds per analysis is a huge bottleneck! Randomly selecting just 10^6 subsets would take around 139 days. Not feasible with my current CPU, even with threading.
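    A minimal Python sketch of that Monte Carlo sampling loop, assuming a hypothetical evaluate_strategy(deck) placeholder standing in for the roughly 12-second combinatorial analysis; deck states are 10-element count vectors for ranks 2-A (tens merged), as in the subsets shown in step 4.) below:

        import random

        FULL_SINGLE_DECK = [4, 4, 4, 4, 4, 4, 4, 4, 16, 4]   # counts for ranks 2..9, T, A

        def random_subset(deck, cards_to_remove, rng):
            """Deal `cards_to_remove` random cards out of `deck` to get a random deck state."""
            counts = deck[:]
            for _ in range(cards_to_remove):
                # pick a remaining card with probability proportional to its count
                r = rng.randrange(sum(counts))
                for rank, c in enumerate(counts):
                    if r < c:
                        counts[rank] -= 1
                        break
                    r -= c
            return counts

        def evaluate_strategy(deck):
            """Hypothetical placeholder for the full combinatorial analysis (~12 s per call)."""
            raise NotImplementedError

        def monte_carlo(n_samples, cards_to_remove=10, seed=0):
            rng = random.Random(seed)
            return [evaluate_strategy(random_subset(FULL_SINGLE_DECK, cards_to_remove, rng))
                    for _ in range(n_samples)]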

    A primer on how I am analysing the strategy chart(s):

    1.) I start by computing the overall expectation for standing by taking a set of n_cards for the player and enumerating all possible dealer hand sets. I compare the outcome of each specific dealer total against the current total of the player hand.

    2.) I then compute the hitting expectation by taking a player hand set {23} and finding matching hand sets for x: 2->A, so that we can compute the weighted expectation of hitting to {232}, {233}, {234}, ..., {23A}. We do this by computing all hard 21's first, then hard 20's...down to all hard 12's; then do the same by computing hitting all soft 21's to soft 13's; then compute all hard 11's to hard 5's; and finally, we compute hitting all pair cards ({22}, {33}, ..., {TT}, {AA}).

    3.) Doubling is the same as hitting, except we only draw one card and stand for any 2_card player hand. We return the expectation of drawing to a specific 3_card hand and multiply the expectation by 2 times the probability of that drawn card.

    4.) For splitting, we repeat steps 1.) to 3.) up to 11 times. Why? We are computing the expectation of splitting to a new hand by removing the rank for which we are splitting (see the sketch after this list). (For example: we want to split {22}. We will compute standing, hitting, and doubling for a given deck subset, minus one of the ranks we are splitting. If we originally split {22} for a deck subset of {4 4 4 4 4 4 4 4 16 4: 2-A} or {3 1 4 3 4 4 3 1 14 3: 2-A}, our new subsets would be {3 4 4 4 4 4 4 4 16 4: 2-A} or {2 1 4 3 4 4 3 1 14 3: 2-A}.)

    5.) After all of this, we should get back our optimal strategy with respect to the computed expectation(s) for each possible decision for each player hand.
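    A minimal Python sketch of the deck adjustment described in step 4.), matching the example given there; deck states are the same 10-element count vectors for ranks 2-A:

        def remove_split_rank(deck, rank_index):
            """Return a new deck-count vector with one card of the split rank removed."""
            assert deck[rank_index] > 0, "cannot split a rank with no cards left"
            new_deck = deck[:]
            new_deck[rank_index] -= 1
            return new_deck

        # Splitting {22} against a full single deck, as in step 4.)
        full_deck = [4, 4, 4, 4, 4, 4, 4, 4, 16, 4]      # ranks 2..9, T, A
        print(remove_split_rank(full_deck, 0))            # -> [3, 4, 4, 4, 4, 4, 4, 4, 16, 4]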

    Now, I have been looking into whether there is a quicker way of analysing the strategy using either GPGPU or FPGA/ASIC. I first looked at GPU programming as a possible route; however, I am not sure if it is feasible, as there are 3072 unique player hand states, meaning there will be 3072 unique data structures to compute for standing, hitting, doubling, and splitting.

    With all this said, I was hoping there was a way to perform some form of simultaneous processing, where the state of each expectation changes depending on the change in the rules and/or the deck state. That is, rather than recompute each expectation for every change in rules/deck_state, there would be some dependence on the output of each data point. For example: we have three data points named A, B, and C. We also have state data {1 2 3}. For each data point, there is a state equation that we possess:

    A = data[0]

    B = A + data[1]

    C = B + data[2]

    Under serial processing, we would evaluate A first as 1, then B as 2 plus A, and C as 3 plus B. If we change data to be {2 2 3}, then A would be 2, B would be 4, and C would be 7. Under serial processing, this would take several cycles per data point, for each data point. Is there a method under FPGA where the change in state is instantaneous? That is, if we change any data point, will A, B, and C all change simultaneously as soon as data changes?
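    As an illustration of the A/B/C example only (not the full blackjack computation): the chain above is a running (prefix) sum over data, which a GPU can evaluate with a parallel scan and an FPGA can lay out as a chain of combinational adders whose outputs settle one propagation delay after the inputs change (fast, but not literally instantaneous). A minimal Python sketch of recomputing the whole chain when data changes:

        import numpy as np

        def chain(data):
            # A = data[0], B = A + data[1], C = B + data[2], ... is a prefix (cumulative) sum
            return np.cumsum(data)

        print(chain([1, 2, 3]))   # [1 3 6]  -> A=1, B=3, C=6
        print(chain([2, 2, 3]))   # [2 4 7]  -> A=2, B=4, C=7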

    If not, what benefits does FPGA/GPGPU processing offer to a programmer who wants to compute large volumes of floating-point data accurately and in near real time? Basically, is there a way to rapidly compute the optimal strategy of a blackjack game that is faster than the 12-second multi-threaded CPU run?

    Thank you for your time!

    submitted by /u/huskyShad0w

    Why are the interesting results on 3D parallel meshes/grids not discussed more?

    Posted: 22 Oct 2018 09:48 AM PDT

    It seems that every week I hear about some development in quantum computing and its possibilities for reducing the big-O times of some algorithms.

    Some interesting results reducing the O time of sorting to O(n^(1/3) log n) happen on 3D meshes (coming from variations of parallel shearsort), similar reductions happen with wide classes of commonly used algorithms, and occasionally I find papers purporting greater speedups with clever 3D manipulations.

    Why is there comparatively little discussion of those interesting results? They obviously matter a great deal, reducing O(n) to O(n^(1/3)).

    submitted by /u/Studying_All_Day

    How Does One Learn Computer Science From The Ground Up?

    Posted: 22 Oct 2018 12:22 PM PDT

    I am interested in exploring a roadmap for learning CS from the ground up. Basically, lambda calculus -> Turing -> etc... I assume that is nowhere near correct, but I hope you get the idea.

    submitted by /u/201805142348

    How many multiplications does the execution of the Horner function use as a function of n? My work is below. Seems obvious, but just here to make sure my answer is correct.

    Posted: 22 Oct 2018 10:40 AM PDT

    Here's the function (in pseudo code):

    function y = Horner([a₀, ..., aₙ], x)
        y = aₙ
        for i = n-1 downto 0:
            y = y*x + aᵢ

    example: Execute y = Horner([2,2,3,1,4], 2)

    initialize y = 4.

    steps in for loop:

    1) y = 4*2 + 1 = 9

    2) y = 9*2 + 3 = 21

    3) y = 21*2 + 2 = 44

    4) y = 44*2 + 2 = 90

    By the looks of it, the Horner function requires n-1 multiplications, correct?
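    A runnable Python version of the pseudocode above, with a counter added as a sanity check on the multiplication count; the loop runs i = n-1 down to 0, i.e. n iterations with one multiplication each, so for the example Horner([2,2,3,1,4], 2) it reports 4 multiplications with n = 4:

        def horner(coeffs, x):
            """Evaluate a0 + a1*x + ... + an*x^n with coeffs = [a0, ..., an]."""
            n = len(coeffs) - 1
            y = coeffs[n]
            multiplications = 0
            for i in range(n - 1, -1, -1):   # i = n-1 downto 0
                y = y * x + coeffs[i]
                multiplications += 1
            return y, multiplications

        print(horner([2, 2, 3, 1, 4], 2))    # -> (90, 4)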

    submitted by /u/PuppyLand95

    In a perfect world, what would be the pros and cons of all software being open source?

    Posted: 22 Oct 2018 01:12 PM PDT

    Help plz: I'm a 14-year-old

    Posted: 22 Oct 2018 08:58 AM PDT

    So, I'm really interested in computer science and just wanted to know if there were any books I should read. P.S. It'd be really helpful if they were suited to the brain of a 14-year-old.

    submitted by /u/TinyMid
