Enthusiast: As I learn to code, what is the true complexity of this algorithm? | Computer Science
- Enthusiast: As I learn to code, what is the true complexity of this algorithm?
- Abstractions of a Computer
- Did Google achieve quantum supremacy or not?
- Coding in the 3DO M2 SDK from 1996 - my weirdest CS use case yet lol
- Google’s ALBERT Is a Leaner BERT; Achieves SOTA on 3 NLP Benchmarks
- Coffee-Python-Popcorn-Netflix! (how Netflix is using Python)
Enthusiast: As I learn to code, what is the true complexity of this algorithm? Posted: 27 Sep 2019 07:21 PM PDT Someone mentioned it could be sub-exponential, but I'm not familiar with sub-exponential time. My knowledge is limited to the O(n), O(2^n), and O(n^k) definitions. I think it could be O(n). So what's the running time in terms of the input bit-length? Is it safe to say it's O(n)? Or is it sub-exponential? (Not sure what that is.) [link] [comments]
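For intuition, "sub-exponential" usually means the exponent itself grows slower than n — for example 2^sqrt(n) — so it eventually outgrows every polynomial but stays below every true exponential 2^(cn). A minimal sketch comparing the three regimes (the specific functions n^3 and 2^sqrt(n) are illustrative choices, not from the original post):

```python
import math

# Illustrative growth rates for input bit-length n:
# polynomial n^3, sub-exponential 2^sqrt(n), exponential 2^n.
def poly(n):
    return n ** 3

def subexp(n):
    return 2 ** math.isqrt(n)   # 2^floor(sqrt(n))

def exponential(n):
    return 2 ** n

for n in (16, 64, 256):
    print(f"n={n}: n^3={poly(n)}, 2^sqrt(n)={subexp(n)}, 2^n={exponential(n)}")
```

Note that for small n the polynomial can still be larger than the sub-exponential function (n^3 beats 2^sqrt(n) until roughly n in the low thousands); the ordering polynomial < sub-exponential < exponential is an asymptotic statement, not a claim about any particular input size.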
Abstractions of a Computer Posted: 27 Sep 2019 09:40 PM PDT Having recently graduated from school as an electrical engineer, I have realized that my knowledge about computers is still very limited. I have taken digital design courses where we learned about the fundamentals of transistors, logic gates, registers, D flip-flops and so on. We even implemented a very simple multicycle processor in SystemVerilog on an FPGA. From this course I was able to understand how a language like C or C++ is translated into assembly and then machine code, which a multicycle processor (or any other variant) can interpret and execute. But since I spent most of my time doing analog design and not taking computer science courses, I have missed out on quite a bit of crucial knowledge. For instance, I am not sure what the next level of abstraction past the processor is. Is it implementing an operating system? Boot loader? Kernel? And if so, in what order do you perform this? In addition, how do you have the hard drive communicate with memory and then the CPU (how do you implement these buses)? How does one write to physical memory, and how do kernel space and user space differ? If anyone has any resources, whether textbooks, lectures, notes, etc., on any of these topics, I would very much appreciate it. Having a low-level understanding and a high-level understanding of how a computer theoretically works, without being able to bridge the two, is a little frustrating. My apologies if my question is not very clear-cut. I simply am lost and could use some direction. Thank you for your time and help. [link] [comments]
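One small piece of the kernel-space/user-space question can be made concrete: user-space code never drives hardware directly. It asks the kernel through a system call, which is the only gate across the privilege boundary. A minimal sketch, assuming a POSIX system (the message is illustrative; `os.write` is Python's thin wrapper over the `write` syscall):

```python
import os

# User-space code cannot touch the disk or display itself. os.write()
# crosses the user/kernel boundary: the CPU traps into kernel mode, the
# kernel's driver performs the I/O on the process's behalf, and control
# returns to user space with the byte count written.
def say_hello():
    msg = b"hello from user space\n"
    return os.write(1, msg)  # fd 1 = stdout; kernel mediates the actual device I/O
```

The same boundary answers the memory question in part: the MMU's page tables, set up by the kernel, are what make "writing to physical memory" impossible from user space — a user process only ever sees virtual addresses, and the syscall interface is how it requests anything beyond them.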
Did Google achieve quantum supremacy or not? Posted: 27 Sep 2019 10:09 AM PDT
Coding in the 3DO M2 SDK from 1996 - my weirdest CS use case yet lol Posted: 27 Sep 2019 05:14 AM PDT Thankfully the SDK manual is comprehensive and I was able to source the appropriate version of MPW on Mac. Still getting 100% used to everything, but I have source compiling and the SDK laying out .image files with the correct file structure (Opera) and with the correct folios and drivers for the output to run on real hardware. It's quite a round trip: files from Win 10 to Mac, compile and create the .image file, files back to Win 10, swap the file extension to .ISO, burn, test, and keep coding from there for any fundamental changes 😣 But it's working. Took a good month to get the SDK, get it correctly installed (a lot of tree issues on Mac) and get it running correctly. But hey...coding for an unreleased platform and testing on hardware that's one of about fifteen pieces worldwide is useful, right 🤣? [link] [comments]
Google’s ALBERT Is a Leaner BERT; Achieves SOTA on 3 NLP Benchmarks Posted: 27 Sep 2019 09:21 AM PDT |
Coffee-Python-Popcorn-Netflix! (how Netflix is using Python) Posted: 27 Sep 2019 05:15 AM PDT |
You are subscribed to email updates from Computer Science: Theory and Application.