• Breaking News

    Tuesday, December 1, 2020

    ‘It will change everything’: DeepMind’s AI makes gigantic leap in solving protein structures Computer Science

    ‘It will change everything’: DeepMind’s AI makes gigantic leap in solving protein structures

    Posted: 30 Nov 2020 11:01 AM PST

    [N] ‘Biology’s ImageNet Moment’ – DeepMind Says Its AlphaFold Has Cracked a 50-Year-Old Biology Challenge

    Posted: 30 Nov 2020 11:13 AM PST

    Google's UK-based AI lab DeepMind says its AlphaFold system has solved the protein folding problem, a grand challenge that has vexed the biology research community for half a century.

    Here is a quick read: 'Biology's ImageNet Moment' – DeepMind Says Its AlphaFold Has Cracked a 50-Year-Old Biology Challenge

    submitted by /u/Yuqing7
    [link] [comments]

    If you're a programmer, could you help me by taking this survey?

    Posted: 30 Nov 2020 08:45 PM PST

    I'm sorry if this is against the rules of the subreddit; I checked them and didn't see anything about surveys.

    The survey is really simple and quick. If you feel like it, please take a look.

    https://forms.gle/pjXre1Hxo7B8RX129

    Thank you in advance.

    submitted by /u/BernardoPiedade
    [link] [comments]

    But it doesn’t help us with P = NP; very exciting nonetheless.

    Posted: 30 Nov 2020 12:08 PM PST

    [N] Synced Tradition and Machine Learning Series | Part 1: Entropy

    Posted: 30 Nov 2020 10:42 AM PST

    A cornerstone of information theory is the idea of quantifying how much information there is in a message and, more generally, in an event. The basic measure used to quantify this information is entropy, which is the topic of this article.
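
    As a minimal illustration (not from the article), the Shannon entropy of a discrete distribution with probabilities p_i is H(X) = -sum_i p_i log2 p_i, measured in bits; a quick Python sketch:

        import math

        def entropy(p):
            # Shannon entropy in bits of a discrete distribution given as a list of probabilities
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
        print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, less uncertainty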

    Classically, information-theoretic quantities such as entropy and relative entropy arise again and again in response to fundamental questions in communication and statistics. These quantities have also found their way into machine learning, where they have become increasingly important.

    The concept of information is too broad to be captured completely by a single definition. However, for any probability distribution, entropy is defined so that its properties agree with the intuitive notion of what a measure of information should be. A related notion of uncertainty is the conditional entropy H(X|Y), the entropy of a random variable X given knowledge of another random variable Y.
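
    For concreteness (the joint distribution below is made up for illustration, not taken from the article), H(X|Y) is the p(y)-weighted average of the entropies of the conditional distributions p(x|y):

        import math

        def H(p):
            # Shannon entropy in bits of a probability vector
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        # hypothetical joint distribution p(x, y) over x in {0, 1} (rows) and y in {0, 1} (columns)
        joint = [[0.4, 0.1],
                 [0.1, 0.4]]

        p_y = [sum(row[y] for row in joint) for y in range(2)]                 # marginal p(y)
        cond = [[joint[x][y] / p_y[y] for x in range(2)] for y in range(2)]    # p(x | y) for each y

        H_X_given_Y = sum(p_y[y] * H(cond[y]) for y in range(2))
        print(H_X_given_Y)   # ~0.72 bits, below H(X) = 1 bit because Y carries information about X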

    In this review, we introduce most of the basic definitions required for the subsequent development of information theory. After defining concepts such as entropy and mutual information, we establish useful properties such as the chain rules relating these quantities. Finally, we give some examples of how these concepts arise in machine learning.
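
    Using the same toy joint distribution as in the sketch above, the chain rule H(X, Y) = H(Y) + H(X|Y) and the identity I(X; Y) = H(X) - H(X|Y) can be checked numerically (again just an illustration, not code from the article):

        import math

        def H(p):
            # Shannon entropy in bits of a probability vector
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        joint = [[0.4, 0.1], [0.1, 0.4]]                                # same toy p(x, y) as above
        p_x = [sum(joint[x][y] for y in range(2)) for x in range(2)]    # marginal p(x)
        p_y = [sum(joint[x][y] for x in range(2)) for y in range(2)]    # marginal p(y)
        H_X_given_Y = sum(p_y[y] * H([joint[x][y] / p_y[y] for x in range(2)]) for y in range(2))
        H_XY = H([joint[x][y] for x in range(2) for y in range(2)])     # joint entropy H(X, Y)

        print(H_XY)                      # ~1.72 bits
        print(H(p_y) + H_X_given_Y)      # chain rule: H(Y) + H(X|Y) matches H(X, Y)
        print(H(p_x) - H_X_given_Y)      # mutual information I(X; Y) = H(X) - H(X|Y), ~0.28 bits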

    Here is the blog link: Synced Tradition and Machine Learning Series | Part 1: Entropy

    submitted by /u/Yuqing7
    [link] [comments]

    Question to all Comp Sci Graduates

    Posted: 30 Nov 2020 08:00 AM PST

    Hey everyone,

    I am a college freshman currently pursuing a comp sci major. However, I am also aiming for a premed track, which means I need a high GPA.

    I was wondering, for everyone who graduated with a comp sci major/degree, what was your college GPA?

    Thanks in advance!

    submitted by /u/sSungjoon
    [link] [comments]

    You will learn how we build a simulator and tests for the continuous Bernoulli distribution using C programming. [free ebook & source code]

    Posted: 30 Nov 2020 09:19 AM PST

    📚Book name: Continuous Bernoulli distribution-simulator and test statistic

    Methods: math, statistics, computer programs (cpp)

    Contents:

    1️⃣ For CB,

    1.1 Generated values of CB

    1.2 Sampling distribution, point estimator, confidence interval, test statistic, and goodness of fit

    2️⃣ Two CB populations: test statistic and confidence interval

    3️⃣ Continuous trinomial (CT) distribution

    The continuous Bernoulli distribution is used in variational autoencoders in deep learning. Now you can test generated CB values, run the test for two continuous Bernoulli populations, and learn about the continuous trinomial (CT) distribution.
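
    The book's programs are in C; as a rough language-agnostic sketch of the sampling step (written here in Python with NumPy, and not taken from the book), CB samples can be drawn by inverting the CDF:

        import numpy as np

        def sample_cb(lam, size, rng=None):
            # Inverse-CDF sampling from the continuous Bernoulli CB(lam) on [0, 1].
            # Density: p(x | lam) is proportional to lam**x * (1 - lam)**(1 - x).
            rng = np.random.default_rng() if rng is None else rng
            u = rng.uniform(size=size)
            if np.isclose(lam, 0.5):
                return u  # CB(0.5) is exactly Uniform(0, 1)
            # CDF: F(x) = (lam**x * (1 - lam)**(1 - x) + lam - 1) / (2*lam - 1); solve F(x) = u for x
            return np.log1p(u * (2 * lam - 1) / (1 - lam)) / np.log(lam / (1 - lam))

        # sanity check against the known CB mean: lam/(2*lam - 1) + 1/(2*arctanh(1 - 2*lam))
        lam = 0.3
        samples = sample_cb(lam, 200_000)
        print(samples.mean(), lam / (2 * lam - 1) + 1 / (2 * np.arctanh(1 - 2 * lam)))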

    free ebook download

    Computer programs: GitHub

    Seeking feedback.

    submitted by /u/myleetw
    [link] [comments]
