
    Saturday, July 31, 2021

    Invariant-preserving binary encoding of graph transformation rules

    Posted: 30 Jul 2021 08:33 PM PDT

    So first, a bit of background to clarify what I mean (the graph transformation notation below is the same as that used by Wolfram).

    Graphs can be seen as sets of relations, so {{1, 2}, {1, 3}, {3, 2}} is a graph with nodes 1, 2 and 3, and edges 1->2, 1->3 and 3->2. Note that relations can be hyperedges like {1, 2, 3}, meaning an edge 1->2->3.

    Graph transformation rules can be defined so that they have a pattern of relations they match and a pattern they produce. With that example graph, applying the rule {{a, b}, {a, c}} -> {{b, a}, {a, d}, {c, d}} would match so that the variables bind to a=1, b=2, c=3; the free variable d becomes a new node with the next free ID, 4; and it would produce the new graph {{2, 1}, {1, 4}, {3, 4}, {3, 2}}.
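
    To make that concrete, here is a minimal Python sketch of the matching/rewriting step, assuming a brute-force search for the binding and assuming matched edges are consumed while unmatched edges are kept (apply_rule and all other names here are purely illustrative):

        from itertools import permutations

        def apply_rule(graph, lhs, rhs):
            """Find one binding of the LHS pattern to edges of the graph, then rewrite.
            Matched edges are consumed; unmatched edges are kept."""
            graph = [tuple(e) for e in graph]
            for chosen in permutations(range(len(graph)), len(lhs)):
                binding, ok = {}, True
                for pat, idx in zip(lhs, chosen):
                    edge = graph[idx]
                    if len(pat) != len(edge):
                        ok = False
                        break
                    for var, node in zip(pat, edge):
                        if binding.setdefault(var, node) != node:  # inconsistent binding
                            ok = False
                            break
                    if not ok:
                        break
                if not ok:
                    continue
                # Variables that appear only on the right-hand side become fresh nodes.
                next_id = max(n for e in graph for n in e) + 1
                new_edges = []
                for pat in rhs:
                    edge = []
                    for var in pat:
                        if var not in binding:
                            binding[var] = next_id
                            next_id += 1
                        edge.append(binding[var])
                    new_edges.append(tuple(edge))
                rest = [graph[i] for i in range(len(graph)) if i not in chosen]
                return new_edges + rest
            return None  # no match

        # The example above: a=1, b=2, c=3, and d gets the next free ID 4.
        print(apply_rule([(1, 2), (1, 3), (3, 2)],
                         [("a", "b"), ("a", "c")],
                         [("b", "a"), ("a", "d"), ("c", "d")]))
        # -> [(2, 1), (1, 4), (3, 4), (3, 2)]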

    Rules have an arity, and since hyperedges (edges with more than two nodes) are allowed, this arity can be mixed: e.g. {{a, b}, {c, d, a}} -> {{a, d, b}, {d, c, a}} has an arity of 1₂1₃ -> 2₃ (one 2-edge and one 3-edge on the left, two 3-edges on the right).
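
    If it helps to see that arity signature as data, here is a tiny Python sketch that counts edge lengths per side of a rule (the helper name arity is just for illustration):

        from collections import Counter

        def arity(side):
            """Multiset of edge lengths on one side of a rule."""
            return Counter(len(e) for e in side)

        print(arity([("a", "b"), ("c", "d", "a")]))       # one 2-edge, one 3-edge -> 1₂1₃
        print(arity([("a", "d", "b"), ("d", "c", "a")]))  # two 3-edges            -> 2₃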

    To guarantee that the produced graphs are always at least weakly connected, the following invariants must hold for all rules:

    1. each side of the rule has to describe at least a weakly connected graph, so e.g. {{a, b}, {c, d}} -> … is not allowed
    2. the right-hand side must use all variables present on the left-hand side (a rough check of both invariants is sketched below)
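
    Here is a rough Python sketch of such a validity check, assuming "weakly connected" means every node is reachable from every other when direction is ignored and a hyperedge is treated as linking all of its nodes (the function names are illustrative only):

        from itertools import combinations

        def weakly_connected(edges):
            """Ignore direction, treat each hyperedge as linking all of its nodes."""
            nodes = {n for e in edges for n in e}
            if not nodes:
                return False
            adj = {n: set() for n in nodes}
            for e in edges:
                for a, b in combinations(e, 2):
                    adj[a].add(b)
                    adj[b].add(a)
            seen, stack = set(), [next(iter(nodes))]
            while stack:
                n = stack.pop()
                if n not in seen:
                    seen.add(n)
                    stack.extend(adj[n] - seen)
            return seen == nodes

        def valid_rule(lhs, rhs):
            """Invariant 1: both sides weakly connected.  Invariant 2: LHS vars reused."""
            lhs_vars = {v for e in lhs for v in e}
            rhs_vars = {v for e in rhs for v in e}
            return weakly_connected(lhs) and weakly_connected(rhs) and lhs_vars <= rhs_vars

        print(valid_rule([("a", "b"), ("c", "d")], [("a", "b", "c", "d")]))                # False
        print(valid_rule([("a", "b"), ("a", "c")], [("b", "a"), ("a", "d"), ("c", "d")]))  # True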

    Now, my question is this: how could I come up with a binary representation for rules that works with an arbitrary but predefined arity (i.e. the arity depends on configuration parameters and won't change during decoding), so that any binary string up to some maximum length always decodes to a correct rule that preserves those invariants? This would naturally mean that multiple strings could decode to the same rule, i.e. it would be a surjective mapping. What I don't want is to have to reject encoded strings when they don't produce correct rules; every string must decode to a valid transformation.
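
    Just to illustrate the "never reject" requirement (this is not a proposed encoding, only a toy Python example of the surjectivity property): one way a decoder can avoid rejection is to read a chunk of bits as an integer and reduce it modulo the number of currently valid choices, so every string lands on some valid option and several strings collapse onto the same one:

        def decode_choice(bits, options):
            """Map ANY bit string onto one of the valid options (never rejects)."""
            value = int(bits, 2) if bits else 0
            return options[value % len(options)]

        options = ["choice-A", "choice-B", "choice-C"]
        for i in range(16):
            bits = format(i, "04b")
            print(bits, "->", decode_choice(bits, options))  # all 16 strings decode to something valid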

    I'm not looking to be handed an answer; any ideas or pointers are welcome.

    submitted by /u/physicomorphic

    [Question] Selecting numbers randomly from a set (without repetition) until all numbers have been selected

    Posted: 15 Jul 2021 02:09 PM PDT

    So, I was wondering whether there are any other non-obvious methods for doing this that require less memory.

    Specifically, given a list of numbers 1..n, pick numbers randomly from that list (without repetition) until all numbers have been selected.

    Whenever I have encountered this problem, I have generated a list L containing numbers 1..n, then either

    • Permuted L, then picked from L in order 0..n-1

    • Set m = L.size. Selected a random index x from the range 0..m-1, swapped L[x] with L[m-1], then returned L[m-1], and set m = m-1. Continue until m = 0. (A sketch of this second method is below.)
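
    Here is a quick Python sketch of that second method, essentially a lazily evaluated Fisher–Yates shuffle (the generator name random_order is just for illustration):

        import random

        def random_order(n):
            """Pick a random index in the live region, swap it to the end, emit it, shrink."""
            L = list(range(1, n + 1))
            m = len(L)
            while m > 0:
                x = random.randrange(m)            # random index in 0..m-1
                L[x], L[m - 1] = L[m - 1], L[x]    # move the pick out of the live region
                yield L[m - 1]
                m -= 1

        print(list(random_order(10)))  # 1..10, each exactly once, in random order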

    However, I am wondering whether there are any other more nuanced tricks that I have not encountered that don't require an entire list of numbers to be generated.

    Sorry if this is the wrong place for this question.

    EDIT: What I'm really interested in is saving space. Specifically, rather than having to store the entire list of n elements, which may not be feasible if n is really large, are there any other space-saving tricks that can be done?

    For example, off the top of my head, I've had the idea of an algorithm that works as follows: we start with an empty list and pick elements at random, storing them as they are selected. Once n/2 elements have been selected, we go through 1..n, collect the elements that haven't been selected yet, and put them into a list that we can then probe as in the original algorithm. This saves memory at the expense of some extra computation. (A rough sketch is below.)
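
    A rough Python sketch of that hybrid idea, assuming rejection sampling for the first half and a shuffled list of the leftovers for the rest (names are illustrative only):

        import random

        def random_order_hybrid(n):
            """Rejection-sample until half of 1..n is emitted, then shuffle the leftovers."""
            seen = set()
            while len(seen) < n // 2:
                x = random.randint(1, n)
                if x not in seen:          # duplicates are rejected and retried
                    seen.add(x)
                    yield x
            remaining = [x for x in range(1, n + 1) if x not in seen]
            random.shuffle(remaining)
            yield from remaining

        print(sorted(random_order_hybrid(11)))  # [1, 2, ..., 11]: every number exactly once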

    submitted by /u/abinett
