• Breaking News

    Tuesday, May 28, 2019

    Is a XOR scrambler secure? Computer Science

    Is a XOR scrambler secure?

    Posted: 27 May 2019 05:11 PM PDT

    If I have an array of 0s and 1s, and I repeatedly XOR it with itself, is it secure enough that the original data will be hard to find?
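
    Taken literally, the operation described can be checked in a couple of lines. The sketch below is only my own illustration of the premise as stated; the array contents are arbitrary. Since x XOR x = 0 for every bit, the result is all zeros regardless of the input, and repeating the operation changes nothing.

        # XOR a bit array with itself, then repeat the operation.
        data = [1, 0, 1, 1, 0, 0, 1, 0]

        scrambled = [a ^ b for a, b in zip(data, data)]
        print(scrambled)   # [0, 0, 0, 0, 0, 0, 0, 0] -- every bit cancels itself

        scrambled = [a ^ b for a, b in zip(scrambled, scrambled)]
        print(scrambled)   # still all zeros; further passes are no-ops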

    submitted by /u/f4gc9bx8
    [link] [comments]

    Exploring Liveness of Avalanche

    Posted: 27 May 2019 11:17 PM PDT

    Cannot decide what to choose for Double Major (Economics or EE)

    Posted: 27 May 2019 05:31 PM PDT

    Hello everyone. I completed my first year in Computer Engineering with really good scores and will start a double major program next year. I have Electrical and Electronics Engineering in mind but am also considering Economics. I am reluctant to choose Business or Industrial Engineering because I think completing an MBA after graduation would be a better choice. I will definitely complete a master's degree after graduation in either case. I am highly interested in machine learning and its applications to stocks and market analysis, and I am considering completing an Economics degree to get a better understanding of markets. However, I also love physics and am really into electronics. Do you have any recommendations? If you have any suggestions other than Economics and EE, I am open to those too. Thanks in advance.

    submitted by /u/TheAncientLibrarian
    [link] [comments]

    How did the Bombe machine work, and does it have anything to do with Turing machines?

    Posted: 27 May 2019 07:02 PM PDT

    Hey guys, first of all, sorry if this is not the correct subreddit. I recently watched The Imitation Game again for the third time, and I also had a uni exam on theory of computation where most of the material was finite automata and Turing machines. So I was wondering whether the Bombe was actually a Turing machine or not. I know how the Enigma worked, but I couldn't really find any YouTube video showing how the Bombe worked and whether it has anything to do with Turing machines.

    submitted by /u/duckafick
    [link] [comments]

    Bridson Poisson Disk Sampling Questions

    Posted: 27 May 2019 03:53 PM PDT

    I recently saw a post where someone mentioned the Bridson Poisson disk sampling algorithm. This paper has been cited a lot (at least by my field's standards), and the algorithm appears in a lot of blog posts. However, I was reading through the paper and I'm somewhat confused about some of the claims it makes; I also have some general questions.

    From what I understand, their analysis is roughly that:

    1. there are 2N-1 iterations, and
    2. each iteration is O(k), where k is constant.

    Therefore, the algorithm is linear time.

    I'm not convinced that the second statement is true. A spatial grid, where at most one point can fit in each cell, is used to speed up the check of whether a new candidate point is too close to any of the previous points. For a fixed dimension, this allows the algorithm to check the distance to only a constant number of previous points. However, as the dimension of the space grows, the number of adjacent cells grows exponentially. So once the dimension is large enough and the number of points isn't too big, it becomes faster to just check all the previous points individually (linear time per iteration). Is it not misleading to write in the title that this is fast in arbitrary dimension?
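
    For concreteness, here is a minimal 2D sketch of the grid-accelerated check being discussed. This is my own simplified illustration of a Bridson-style sampler, not the paper's reference code; the cell size r/sqrt(2) guarantees at most one accepted sample per cell, and each candidate is tested only against a fixed 5x5 block of cells, which is exactly the neighbourhood whose size blows up with dimension.

        import math
        import random

        def bridson_2d(width, height, r, k=30):
            """Simplified 2D Poisson disk sampling with a background grid (illustrative sketch)."""
            cell = r / math.sqrt(2)                        # each cell can hold at most one sample
            cols, rows = math.ceil(width / cell), math.ceil(height / cell)
            grid = [[None] * cols for _ in range(rows)]    # the spatial acceleration grid

            def cell_of(p):
                return int(p[1] // cell), int(p[0] // cell)

            def far_enough(p):
                gr, gc = cell_of(p)
                # For d = 2 only a constant 5x5 block of cells is inspected; this
                # neighbourhood is what grows exponentially with the dimension.
                for i in range(max(gr - 2, 0), min(gr + 3, rows)):
                    for j in range(max(gc - 2, 0), min(gc + 3, cols)):
                        q = grid[i][j]
                        if q is not None and (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 < r * r:
                            return False
                return True

            first = (random.uniform(0, width), random.uniform(0, height))
            samples, active = [first], [first]
            gr, gc = cell_of(first)
            grid[gr][gc] = first

            while active:
                p = random.choice(active)
                for _ in range(k):                         # up to k candidates per active sample
                    ang = random.uniform(0, 2 * math.pi)
                    rad = random.uniform(r, 2 * r)
                    q = (p[0] + rad * math.cos(ang), p[1] + rad * math.sin(ang))
                    if 0 <= q[0] < width and 0 <= q[1] < height and far_enough(q):
                        samples.append(q)
                        active.append(q)
                        gr, gc = cell_of(q)
                        grid[gr][gc] = q
                        break
                else:                                      # no candidate accepted: retire p
                    active.remove(p)
            return samples

        print(len(bridson_2d(1.0, 1.0, 0.05)))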

    I also see references to a "Poisson Disk Distribution". Does anyone have a definition for this? From this paper, it seems that, for some region, you generate the Poisson disk distribution by picking points one at a time uniformly at random from the space and discarding them if they are within some tolerance r of any previously placed point. I guess this should be equivalent to deleting a ball of radius r around each placed point and then sampling uniformly at random from the remaining set. But then the algorithm in the paper isn't even generating this distribution.
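
    The process described in that definition, as I read it, is the naive "dart throwing" loop sketched below. Again, this is only my own illustration; the region, tolerance, and attempt count are arbitrary stand-ins.

        import random

        def dart_throwing(attempts, r, width=1.0, height=1.0):
            """Propose uniform points; reject any within r of an already accepted point."""
            accepted = []
            for _ in range(attempts):
                p = (random.uniform(0, width), random.uniform(0, height))
                if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r * r for q in accepted):
                    accepted.append(p)
            return accepted

        print(len(dart_throwing(10000, 0.05)))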

    submitted by /u/h8a
    [link] [comments]

    Geoffrey Hinton Leads Google Brain Representation Similarity Index Research Aiming to Understand…

    Posted: 27 May 2019 08:30 AM PDT

    Short Certified Course Suggestions?

    Posted: 27 May 2019 12:01 PM PDT

    Hello. I am a computer science graduate, and the company I work for wants to send me to a course in dev / data analysis / AI / full-stack dev. I can take a course anywhere in the world, but I would prefer to get a certificate by the end of it. The course should be around 5 days. Please recommend something; I am on a tight deadline. I have a good foundation in development and would like to learn more about mobile dev (iOS).

    submitted by /u/hopelessprogrammer01
    [link] [comments]

    What are the advantages of solving constrained SAT instances in poly-time?

    Posted: 27 May 2019 03:03 PM PDT

    Recently, I successfully created an algorithm that completes Latin squares in poly-time for Sudoku when I implement a certain type of constraint.

    I figure that this will work for n! subsets out of all n^2 x n^2 grids, and I found that there are equivalent, non-finite subsets of SAT that can be solved by the same algorithm. By converting those n! Sudoku subsets into SAT and implementing the same constraints as in the Sudoku solver, I am able to solve those easy SAT instances as well.

    I've already published this algorithm on the internet, but it's dispersed and unorganized. I'm in the process of concluding that my experimentation turned out the way I intuitively expected it to. Now, these constrained instances contain hundreds of thousands of clauses and can be solved in poly-time when given a special clause, just like the n x n box described in the links. For anyone who is interested in some discussion, give me your thoughts and opinions on how useful it would be to convert my language into something like circuit design.

    Why not reduce my n! out of n^2 x n^2 Sudoku grids into SAT instances and see whether they reduce to a problem in P, to prove that constrained SAT is in P?

    There just has to be some practicality in that. If I convert the constrained SAT instances into functional products such as AI, would it be practical? What about circuit design? Would I gain any significant performance from designing circuits around a constrained SAT that can be solved in poly-time?

    Here are the links to the algorithm and the other prerequisite information needed to fully understand it.

    I'm politely asking that readers not downvote the OP's comments if they do not understand the algorithm. If you don't understand it, then please read and study it first.

    This is a non-trivial algorithm and it takes non-trivial understanding.

    https://cs.stackexchange.com/questions/107183/will-this-algorithm-always-solve-a-constrained-sudoku-puzzle-in-quadratic-time

    https://www.reddit.com/r/computerscience/comments/btrl58/tested_validity_of_algorithim_that_solves_50/?utm_source=share&utm_medium=web2x

    submitted by /u/Hope1995x
    [link] [comments]

    Tested Validity of an Algorithm That Solves 50! Sudoku Puzzles

    Posted: 27 May 2019 01:55 PM PDT

    I used a SAT solver after generating a 50x50 Sudoku grid in poly-time. I then removed elements to create a puzzle and converted that puzzle into a SAT instance.
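
    For readers who haven't seen the conversion step, turning a Sudoku puzzle into a SAT instance usually means the standard CNF encoding sketched below: one Boolean variable per (row, column, value) triple, plus clauses saying every cell gets exactly one value, no value repeats in a row, column, or box, and the given clues are fixed. This is my own illustration of that textbook encoding, not the OP's constrained variant; it produces n^3 variables, which is consistent with the 125,000 variables reported in the solver log further down (50^3 = 125,000).

        from itertools import combinations

        def sudoku_to_cnf(puzzle, box):
            """Standard Sudoku -> CNF encoding (illustrative sketch).
            puzzle: n x n list of lists with 0 for empty cells; box: side length of a sub-box."""
            n = len(puzzle)
            clauses = []

            def var(r, c, v):
                return r * n * n + c * n + v + 1                        # 1-based DIMACS numbering

            for r in range(n):
                for c in range(n):
                    clauses.append([var(r, c, v) for v in range(n)])    # cell holds a value
                    for v1, v2 in combinations(range(n), 2):            # and at most one
                        clauses.append([-var(r, c, v1), -var(r, c, v2)])

            for v in range(n):
                for i in range(n):
                    for j1, j2 in combinations(range(n), 2):
                        clauses.append([-var(i, j1, v), -var(i, j2, v)])    # once per row
                        clauses.append([-var(j1, i, v), -var(j2, i, v)])    # once per column
                for br in range(0, n, box):
                    for bc in range(0, n, box):
                        cells = [(br + dr, bc + dc) for dr in range(box) for dc in range(box)]
                        for (r1, c1), (r2, c2) in combinations(cells, 2):
                            clauses.append([-var(r1, c1, v), -var(r2, c2, v)])  # once per box

            for r in range(n):
                for c in range(n):
                    if puzzle[r][c]:
                        clauses.append([var(r, c, puzzle[r][c] - 1)])   # fix the given clues
            return clauses

    Writing the clauses out with a "p cnf" header, one clause per line terminated by 0, gives a DIMACS file that glucose or any other standard SAT solver will accept.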

    With this algorithm, I can semi-solve constrained 50! Sudoku Puzzles in poly-time given a correct box for mapping.

    Proof of the algorithm

    https://www.reddit.com/r/learnprogramming/comments/bsnrse/i_created_an_algorithm_that_generates_50_sudoku/

    https://cs.stackexchange.com/questions/107183/will-this-algorithm-always-solve-a-constrained-sudoku-puzzle-in-quadratic-time

    Here's the link to the Sudoku box. Notice that the top-left of the grid is the n x n box needed for the 50! case. The elements warp, and sometimes the box may need to be decided.

    https://cdn.hackaday.io/files/1652867071596224/output.txt

    Look for the top-left box. When it's a 9! Sudoku, look for the bottom-right.

    The mapping may warp the n x n boxes and make it difficult to semi-solve in poly-time.

    c
    c This is glucose 4.0 -- based on MiniSAT (Many thanks to MiniSAT team)
    c
    c ========================================[ Problem Statistics ]===========================================
    c |                                                                                                        |
    c |  Number of variables:     125000                                                                       |
    c |  Number of clauses:       11400                                                                        |
    c |  Parse time:              0.10 s                                                                       |
    c |                                                                                                        |
    c |  Preprocesing is fully done
    c |  Eliminated clauses:      0.46 Mb                                                                      |
    c |  Simplification time:     0.26 s                                                                       |
    c |                                                                                                        |
    c ========================================[ MAGIC CONSTANTS ]==============================================
    c | Constants are supposed to work well together :-)                                                       |
    c | however, if you find better choices, please let us known...                                            |
    c |--------------------------------------------------------------------------------------------------------|
    c |                         |                          |                         |
    c | - Restarts:             | - Reduce Clause DB:      | - Minimize Asserting:   |
    c |   * LBD Queue    : 50   |   * First     : 2000     |   * size < 30           |
    c |   * Trail Queue  : 5000 |   * Inc       : 300      |   * lbd  < 6            |
    c |   * K            : 0.80 |   * Special   : 1000     |                         |
    c |   * R            : 1.40 |   * Protected : (lbd)< 30 |                        |
    c |                         |                          |                         |
    c ==================================[ Search Statistics (every 10000 conflicts) ]=========================
    c |                                                                                                        |
    c |        RESTARTS         |          ORIGINAL          |            LEARNT            | Progress |
    c |  NB  Blocked  Avg Cfc   |  Vars  Clauses  Literals   |  Red  Learnts  LBD2  Removed |          |
    c =========================================================================================================
    c last restart ## conflicts  :  0 62475
    c =========================================================================================================
    c restarts              : 1 (0 conflicts in avg)
    c blocked restarts      : 0 (multiple: 0)
    c last block at restart : 0
    c nb ReduceDB           : 0
    c nb removed Clauses    : 0
    c nb learnts DL2        : 0
    c nb learnts size 2     : 0
    c nb learnts size 1     : 0
    c conflicts             : 0 (0 /sec)
    c decisions             : 62476 (0.00 % random) (169725 /sec)
    c propagations          : 66200 (179842 /sec)
    c conflict literals     : 0 (-nan % deleted)
    c nb reduced Clauses    : 0
    c CPU time              : 0.368101 s
    s SATISFIABLE
    SAT

    What I find strange is that my quadratic-time poly-solver might be slower than the brute-forcer under certain circumstances.

    Three months have paid off. Successfully executed. Sweet. I'm ready...

    Heck, I've created my own variation of Sudoku called Max-SAT Sudoku. It's a reduction Sudoku ≤p Max-SAT, followed by converting that Max-SAT instance into an n^2 x n^2 grid. It completes the maximum number of Latin squares out of an invalid puzzle.

    :)

    With the poly-time semi-solver I can solve constrained SAT instances with 100,000s of variables in poly-time. Wow.

    submitted by /u/Hope1995x
    [link] [comments]

    Quick question about left-recursion

    Posted: 27 May 2019 09:45 AM PDT

    So I'm reading this link about recursive-descent parsing and for a left-recursive grammar it gives the example of:

    E → T + E

    E → T

    however, earlier it presents as an acceptable grammar this:

    E → T

    E → ( E + E )

    T → n

    And I'm having trouble understanding why that is not also left-recursive; rather, why does the addition of parentheses not also make it recursive? Are grammars only recursive if their left-most or right-most symbols match a specific production?
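
    To make the difference concrete, here is a toy recursive-descent parser for the second grammar (E -> T | ( E + E ), T -> n). This is only my own sketch, with a plain token list standing in for whatever lexer the link assumes. The point it illustrates: for the parenthesised production, the recursive call happens only after a '(' has been consumed, so each recursive call works on strictly shorter remaining input, whereas a production whose right-hand side starts with E itself would recurse before consuming anything.

        def parse_E(tokens, pos):
            """Parse an E starting at tokens[pos]; return the position just past it."""
            if tokens[pos] == 'n':                 # E -> T -> n: consume one token, no recursion
                return pos + 1
            if tokens[pos] == '(':                 # E -> ( E + E ): a token is consumed *before*
                pos = parse_E(tokens, pos + 1)     # the recursive call, so the input shrinks
                assert tokens[pos] == '+'
                pos = parse_E(tokens, pos + 1)
                assert tokens[pos] == ')'
                return pos + 1
            raise SyntaxError(f"unexpected token {tokens[pos]!r} at position {pos}")

        # Parses all of "( n + ( n + n ) )" and returns 9, the input length.
        print(parse_E(['(', 'n', '+', '(', 'n', '+', 'n', ')', ')'], 0))

        # By contrast, a left-recursive production such as E -> E + T would make
        # parse_E call itself immediately, before consuming any token, and never terminate.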

    submitted by /u/tkld
    [link] [comments]

    What are the advantages of m-way search trees over binary search trees?

    Posted: 27 May 2019 08:24 AM PDT

    Theory of integer long division explained — along with its 4+ alternatives

    Posted: 27 May 2019 08:12 AM PDT
