• Breaking News

    Thursday, April 16, 2020

    Help me understand VPNs? Computer Science

    Help me understand VPNs?

    Posted: 15 Apr 2020 07:17 PM PDT

    So when I connect to a network using a VPN, it takes my data, which is wrapped in a TCP header, and re-wraps the whole packet in a UDP header. What I don't understand is why it gets wrapped in another header at all. Why can I not just connect to my network using normal TCP packets and a password to access network drives? I am not confused about using UDP specifically; I know that wrapping a TCP packet inside another TCP packet causes problems, but why wrap it at all?
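
    The usual answer to the "why wrap at all" part is that the inner packet is addressed to the private network behind the VPN (addresses the public internet will not route) and needs to be encrypted, so the whole thing travels as the payload of an outer packet addressed to the VPN server's public IP. Below is a minimal Python sketch of that idea; the server address, port, and toy XOR "encryption" are placeholders of my own, not part of any real VPN protocol.

        # Minimal, illustrative sketch of VPN-style encapsulation -- not a real
        # VPN protocol.  The server address, port, and XOR "encryption" are
        # placeholders standing in for a real tunnel endpoint and a real cipher.
        import socket

        TUNNEL_SERVER = "203.0.113.10"   # assumed public IP of the VPN server
        TUNNEL_PORT = 51820              # assumed tunnel port

        def send_through_tunnel(inner_packet: bytes, key: bytes) -> None:
            """Wrap an already-built inner packet (IP/TCP headers and all) as the
            payload of an outer UDP datagram addressed to the tunnel endpoint."""
            # Placeholder "encryption": a real VPN encrypts and authenticates the
            # payload, so routers on the path only ever see the outer UDP header.
            keystream = key * (len(inner_packet) // len(key) + 1)
            ciphertext = bytes(b ^ k for b, k in zip(inner_packet, keystream))

            outer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            outer.sendto(ciphertext, (TUNNEL_SERVER, TUNNEL_PORT))
            outer.close()

        # Pretend these are the raw bytes of a TCP/IP packet bound for 10.0.0.5,
        # a private address that public routers would refuse to deliver.
        inner = b"\x45\x00..."           # truncated stand-in for a real packet
        send_through_tunnel(inner, key=b"not-a-real-key")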

    submitted by /u/Crusader899
    [link] [comments]

    Roadmap to Computer Vision

    Posted: 16 Apr 2020 04:11 AM PDT

    An introduction to the main steps that make up a computer vision system, starting from how images are pre-processed, through how features are extracted, to how predictions are made.

    https://towardsdatascience.com/roadmap-to-computer-vision-79106beb8be4
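
    As a rough illustration of those three stages, here is a NumPy-only sketch; the grayscale-histogram feature and nearest-centroid classifier are illustrative assumptions of mine, not taken from the linked article.

        # A minimal sketch of the three stages above -- pre-processing, feature
        # extraction, prediction -- using NumPy only.
        import numpy as np

        def preprocess(img: np.ndarray) -> np.ndarray:
            """Convert an RGB image (H, W, 3) with values in [0, 255] to grayscale in [0, 1]."""
            return img.mean(axis=2) / 255.0

        def extract_features(gray: np.ndarray, bins: int = 16) -> np.ndarray:
            """A simple global feature: a normalized intensity histogram."""
            hist, _ = np.histogram(gray, bins=bins, range=(0.0, 1.0))
            return hist / hist.sum()

        def predict(features: np.ndarray, centroids: dict) -> str:
            """Nearest-centroid classifier over pre-computed per-class feature centroids."""
            return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))

        # Toy usage with random data standing in for real images and trained centroids.
        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, size=(32, 32, 3)).astype(float)
        centroids = {"cat": rng.random(16), "dog": rng.random(16)}
        print(predict(extract_features(preprocess(img)), centroids))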

    submitted by /u/zlatanmunutd10
    [link] [comments]

    (xpost /r/haskell) I wrote an introduction to lambda calculus

    Posted: 15 Apr 2020 06:23 AM PDT

    CMU, DeepMind & Google’s XTREME Benchmarks Multilingual Model Generalization Across 40 Languages

    Posted: 15 Apr 2020 11:51 AM PDT

    Junjie Hu is a PhD student at the Language Technologies Institute of Carnegie Mellon University. Hu previously interned at Google, where he observed that it was difficult to train cross-lingual models due to the lack of established, comprehensive tasks and environments for evaluating and comparing model performance on cross-lingual generalization.

    Although recent multilingual approaches like mBERT and XLM have shown impressive results in learning general-purpose multilingual representations, a fair comparison between these models remains difficult as most evaluations focus on different sets of tasks designed for similar languages.

    Hu and another CMU researcher, together with DeepMind's Sebastian Ruder and researchers from Google, recently published a study introducing XTREME, a multi-task benchmark that evaluates the cross-lingual generalization capabilities of multilingual representations across 40 languages and nine tasks. Hu told Synced, "Hopefully XTREME can encourage more research efforts in building multilingual NLP models and effective human curations for multilingual resources."

    Here is a quick read: CMU, DeepMind & Google's XTREME Benchmarks Multilingual Model Generalization Across 40 Languages

    Read the original paper here. Find the code here.

    submitted by /u/Yuqing7
    [link] [comments]

    Can yacc be used to generate a compiler for an L-attributed LR(1) grammar?

    Posted: 15 Apr 2020 08:55 AM PDT

    I have read that yacc generates a bottom-up parser for LALR(1) grammars. I have a grammar for Java 1 that can be used for generating three-address code and is strictly LALR(1), but the translation scheme I am employing makes it L-attributed. Now I have read that L-attributed LR grammars cannot be parsed using bottom-up parsing. So, can yacc be used here or not?
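
    As a toy illustration of the issue the question raises: a bottom-up action only runs when a production is reduced, so it can read the children's synthesized values off the value stack, whereas an inherited attribute would have to reach a child before that child is reduced. yacc/bison can emulate some inherited attributes with mid-rule actions or by peeking at values below the rule ($0), but as far as I know that only covers cases where the needed value sits at a known stack offset, not arbitrary L-attributed schemes. The sketch below is hand-rolled Python, not yacc itself.

        # Toy, hand-rolled illustration (Python, not yacc) of why bottom-up
        # semantic actions naturally handle synthesized attributes: an action
        # fires at the moment of a reduce, when the children's values already
        # sit on the value stack.
        value_stack = []

        def reduce_E_plus_T():
            """Action for E -> E '+' T: pop the children's synthesized values,
            push the synthesized value of E."""
            t = value_stack.pop()
            _plus = value_stack.pop()
            e = value_stack.pop()
            value_stack.append(e + t)

        # Pretend the parser has already shifted/reduced "2 + 3" down to the
        # values of E, '+', and T, then performs the reduce.
        value_stack.extend([2, "+", 3])
        reduce_E_plus_T()
        print(value_stack)   # [5]

        # An inherited (L-attributed) attribute would have to flow *down* to a
        # child before that child is reduced -- information the action above
        # never sees at that point in the parse.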

    submitted by /u/homohierarchicus
    [link] [comments]

    Does such a data structure exist?

    Posted: 15 Apr 2020 06:23 AM PDT

    Wondering if there is a data structure that has the best of both worlds of an array and a linked list, i.e. O(log n) for search and O(1) for insert... does such a thing exist!?
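
    For context, the classic compromises here are balanced search trees and skip lists (O(log n) for both search and insert, keeping elements ordered) or hash tables (expected O(1) insert and lookup, but no ordered search). Below is a small, purely illustrative sketch of the sorted-array half of the trade-off using the standard-library bisect module: search is O(log n), but each insert is O(n).

        # Illustrative only: the array half of the trade-off, using Python's
        # standard bisect module on a sorted list -- O(log n) search, but each
        # insert is O(n) because elements must shift to keep the list sorted.
        import bisect

        data = [3, 7, 12, 25, 40]

        def contains(xs, x):
            """Binary search for membership in a sorted list: O(log n)."""
            i = bisect.bisect_left(xs, x)
            return i < len(xs) and xs[i] == x

        bisect.insort(data, 10)     # O(n) shift, not O(1)
        print(contains(data, 10))   # True
        print(data)                 # [3, 7, 10, 12, 25, 40]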

    submitted by /u/DudeKeepTrying
    [link] [comments]
