
    Friday, December 18, 2020

    CompSci Weekend SuperThread (December 18, 2020) Computer Science

    CompSci Weekend SuperThread (December 18, 2020)

    Posted: 17 Dec 2020 05:04 PM PST

    /r/compsci strives to be the best online community for computer scientists. We moderate posts to keep things on topic.

    This Weekend SuperThread provides a discussion area for posts that might be off-topic normally. Anything Goes: post your questions, ideas, requests for help, musings, or whatever comes to mind as comments in this thread.

    Pointers

    • If you're looking to answer questions, sort by new comments.
    • If you're looking for answers, sort by top comment.
    • Upvote a question you've answered for visibility.
    • Downvoting is discouraged. Save it for discourteous content only.

    Caveats

    • It's not truly "Anything Goes". Please follow Reddiquette and use common sense.
    • Homework help questions are discouraged.
    submitted by /u/AutoModerator

    The Most Popular Programming Languages - 1965/2020 - Statistics and Data

    Posted: 18 Dec 2020 03:03 AM PST

    Is MPI still popular in distributed programming?

    Posted: 17 Dec 2020 06:29 PM PST

    MPI (e.g. as provided by Open MPI) was very popular on clusters, at least in the 2000s and early 2010s.

    Is it correct that MPI is heard of much less nowadays? Is MPI still popular in distributed programming?

    What are alternatives to MPI? Are there some other programming APIs/models replacing MPI?

    For example,

    • in traditional scientific computation,
    • in big data, machine learning, or data mining (versus MapReduce, the relevant parts of Spark, ...),
    • in internet/network services (versus RPC, web services, message queues/brokers, publish-subscribe, ...),
    • ...

    Thanks.
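    For context, the core MPI pattern the question is about is point-to-point send/receive plus collectives such as reduce. Below is a toy sketch of that shape; threads and a queue stand in for MPI processes purely for illustration, and with mpi4py the equivalent calls would be comm.send, comm.recv, and comm.reduce.

```python
import threading
import queue

# Each "rank" computes a partial result and sends it to rank 0, which
# reduces with a sum -- the shape of MPI_Send / MPI_Recv / MPI_Reduce
# with MPI_SUM. Real MPI uses separate processes, possibly on separate
# machines; a queue stands in here purely to show the pattern.
NUM_RANKS = 4
inbox = queue.Queue()  # rank 0's receive "channel"

def worker(rank):
    partial = sum(range(rank * 10, (rank + 1) * 10))
    inbox.put(partial)  # "send" the partial result to rank 0

threads = [threading.Thread(target=worker, args=(r,))
           for r in range(1, NUM_RANKS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Rank 0 contributes its own chunk, then "reduces" the received ones.
total = sum(range(0, 10)) + sum(inbox.get() for _ in range(NUM_RANKS - 1))
print(total)  # 780, the sum of 0..39
```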

    submitted by /u/timlee126

    [R] WILDS: Benchmarking Distribution Shifts in 7 Societally-Important Datasets

    Posted: 17 Dec 2020 02:47 PM PST

    One of the significant challenges for deploying machine learning (ML) systems in the wild is distribution shifts — changes and mismatches in data distributions between training and test times. To address this, researchers from Stanford University, University of California-Berkeley, Cornell University, California Institute of Technology, and Microsoft, in a recent paper, present "WILDS," an ambitious benchmark of in-the-wild distribution shifts spanning diverse data modalities and applications.

    Here is a quick read: WILDS: Benchmarking Distribution Shifts in 7 Societally-Important Datasets

    The paper Wilds: A Benchmark of in-the-Wild Distribution Shifts is on arXiv. The WILDS Python package and additional information are available on the Stanford University website. There is also a project GitHub.

    submitted by /u/Yuqing7

    Reading a formal description of a Turing machine

    Posted: 18 Dec 2020 02:48 AM PST

    If a Turing machine is represented like this:

    M(<M', <M'', x>>)

    is it right to say that the Turing machine M receives the encoding of M' together with the encoding of M'' paired with the encoding of the input x?
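    Read from the inside out: <M'', x> is the encoding of machine M'' paired with the encoding of input x, and that pair is in turn paired with the encoding of M'. A toy illustration of the nesting using Python tuples (the strings are stand-ins for actual encodings, purely to show the structure):

```python
# Stand-ins for the string encodings of the two machines and the input.
enc_M1 = "<M'>"
enc_M2 = "<M''>"
enc_x = "<x>"

# Inner pair <M'', x>, then the outer pair <M', <M'', x>>.
inner = (enc_M2, enc_x)
outer = (enc_M1, inner)

# M would receive `outer` as its single input.
print(outer)  # ("<M'>", ("<M''>", "<x>"))
```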

    submitted by /u/sachal10

    Does the array declared as uint32_t x[20]; take up 80 bytes of memory, or 640 bytes?

    Posted: 18 Dec 2020 05:20 AM PST
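    Each uint32_t is 32 bits = 4 bytes, so the array occupies 20 × 4 = 80 bytes; 640 is its size in bits (80 × 8), which is the likely source of the other figure. A quick check, mirroring the C declaration with Python's ctypes:

```python
import ctypes

# uint32_t x[20]; -- each element is 32 bits = 4 bytes.
x_type = ctypes.c_uint32 * 20
size_bytes = ctypes.sizeof(x_type)
print(size_bytes)      # 80 bytes
print(size_bytes * 8)  # 640 bits
```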

    College Freshman side project for internship/resume

    Posted: 18 Dec 2020 03:45 AM PST

    Hey guys, I made a 3x3 sliding-puzzle solver in Python using the A* algorithm. I was wondering if this side project would be good enough to help me find an internship. All of the code is written from scratch; none of it is copied from online.
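    For readers curious about the approach: A* on the 8-puzzle typically uses Manhattan distance as its admissible heuristic. A minimal sketch of that piece (the poster's code is not shown, so this is an illustrative fragment, not their implementation):

```python
def manhattan(state, goal):
    """Sum of Manhattan distances of each tile (0 = blank, ignored)
    from its position in `state` to its position in `goal`.
    States are 9-tuples read row by row on the 3x3 board."""
    dist = 0
    for idx, tile in enumerate(state):
        if tile == 0:
            continue
        g = goal.index(tile)
        dist += abs(idx // 3 - g // 3) + abs(idx % 3 - g % 3)
    return dist

goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
start = (1, 2, 3, 4, 5, 6, 7, 0, 8)  # one slide away from solved
print(manhattan(start, goal))  # 1
```

    A* then orders its frontier by path cost so far plus this heuristic; Manhattan distance never overestimates, so the first solution found is optimal.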

    Thank you

    submitted by /u/michael8pho

    Thread priority problem

    Posted: 17 Dec 2020 05:24 PM PST

    I have an arbitrary number of threads and an arbitrary number of priority levels. The threads are triggered by some external event and need to grab a global lock around some central context to accomplish their work. So events are constantly streaming in while a single thread holds the central context lock; when that lock is released, the highest-priority thread waiting for it should be the next to acquire it. It would be possible, for example, for lower-priority threads to be stuck waiting for a while during a burst of high-priority events.

    I'm trying to come up with a solution using POSIX synchronization primitives. I'm sketching something out with priority semaphores, but it's not quite there.

    Edit: adding to this, "arbitrary number of priority levels" is just for theory's sake. The real-world implementation will have something like 5 levels, so it's fine if the solution involves a bunch of individual locks, etc.; the implementation will be manageable.
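    One common shape for this is a condition variable guarding a heap of waiters: on release, wake everyone and let the heap decide who proceeds. A sketch in Python (the class name and API are illustrative, not a standard interface); a POSIX version would pair a pthread_mutex_t with a pthread_cond_t the same way, using pthread_cond_broadcast on release:

```python
import heapq
import itertools
import threading
import time

class PriorityLock:
    """Lock that hands off to the highest-priority waiter
    (lower number = higher priority)."""

    def __init__(self):
        self._cond = threading.Condition()
        self._waiters = []                 # heap of (priority, arrival seq)
        self._counter = itertools.count()  # FIFO tie-break within a level
        self._held = False

    def acquire(self, priority):
        with self._cond:
            me = (priority, next(self._counter))
            heapq.heappush(self._waiters, me)
            # Sleep until the lock is free AND we are the best waiter.
            while self._held or self._waiters[0] != me:
                self._cond.wait()
            heapq.heappop(self._waiters)
            self._held = True

    def release(self):
        with self._cond:
            self._held = False
            self._cond.notify_all()  # wake all; the heap picks the winner

# Demo: the main thread holds the lock while three workers queue up in
# arrival order 3, 1, 2; on release they should run in priority order.
lock = PriorityLock()
order = []

def worker(prio):
    lock.acquire(prio)
    order.append(prio)
    lock.release()

lock.acquire(0)
workers = [threading.Thread(target=worker, args=(p,)) for p in (3, 1, 2)]
for w in workers:
    w.start()
time.sleep(0.2)  # let every worker block on acquire()
lock.release()
for w in workers:
    w.join()
print(order)  # [1, 2, 3]
```

    The broadcast-and-recheck wakeup is simple but wakes every waiter on each release; with only ~5 priority levels, one condition variable per level would avoid that thundering herd.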

    submitted by /u/rro99

    [D] 2020 in Review | 10 AI Papers That Made an Impact

    Posted: 17 Dec 2020 04:28 PM PST

    Much of the world may be on hold, but AI research is still booming. The volume of peer-reviewed AI papers has grown by more than 300 percent over the last two decades, and attendance at AI conferences continues to increase significantly, according to the Stanford AI Index. In 2020, AI researchers made exciting progress in applying transformers to areas beyond natural-language processing (NLP), bringing the powerful network architecture to protein sequence modelling and to computer vision tasks such as object detection and panoptic segmentation. Meanwhile, this year's improvements in unsupervised and self-supervised learning methods turned them into serious alternatives to traditional supervised learning.

    As part of our year-end series, Synced highlights 10 artificial intelligence papers that garnered extraordinary attention and accolades in 2020.

    Here is a quick read: 2020 in Review | 10 AI Papers That Made an Impact

    submitted by /u/Yuqing7

    I'm Being Paid To Study Computer Science

    Posted: 17 Dec 2020 08:06 PM PST

    OK, I'm not really being paid to study Computer Science, but my employer has not given me any projects for years and years and years. Yet I am still expected to report on what I'm doing every day. So I have been studying every topic in Computer Science I can think of. I have studied more math than I ever wanted to know: combinatorics, graph theory, linear algebra, Boolean algebra, etc. I have studied multiple programming languages, including Java, C#, Python, C++, COBOL, and Fortran.

    I am primarily a web developer but I mastered HTML, JavaScript, CSS, and SQL a long time ago.

    My problem now is that I have run out of things to study. Recently I consumed an entire book on statistics to learn how to use RStudio. I have also read books specifically on Bayesian statistics. Currently I am learning Lisp, which is something different for me. The syntax is a bit strange, but it is making sense to me.

    Do you have any suggestions on what else I could study? It can't be anything too advanced like machine learning which I really struggle to understand. And it has to be something useful that I can justify spending company time to learn. Obviously my employer is very lenient on that score!

    submitted by /u/webauteur

    Considering an EE and CS double major

    Posted: 17 Dec 2020 04:34 PM PST

    Hello, I'm a freshman in university and I'm considering a double major in EE and CS. Is it doable? I've heard that the two have somewhat of an overlap.

    submitted by /u/lightofday1
