Description

The LINC assembly language allows you to combine the basic operations of the LINC protocol, '''rd()''', '''get()''' and '''put()''', to manipulate the resources of the system.

Let us consider the following script, extracted from the tutorial:

{* ,!}["Test", "Customer"].rd(name, departure) &
{* ,!}["Test", "Destination"].rd(name ,arrival) &
 {* ,!}["Test", "Connection"].rd(departure, arrival) &
 ::
 {
 ["Test", "Customer"].get(name, departure);
 ["Test", "Destination".get(name, arrival);
 ["Test", "Connection"].rd(departure, arrival)  ;
 ["Test", "Customer"].put(name, arrival);
 ["Test", "Travel"].put(departure, arrival, name);
 }

A LINC rule is above all a production rule with:

  • a precondition part that corresponds to the resources we are interested in
{* ,!}["Test", "Customer"].rd(name, departure) &
 {* ,!}["Test", "Destination"].rd(name ,arrival) &
 {* ,!}["Test", "Connection"].rd(departure, arrival) &
  • a separator

::

  • a performance part that corresponds to the operations we want to perform on the resources found in the precondition part. Operations enclosed in curly brackets are performed with the all-or-nothing property, forming a transaction. Several transactions can be defined; in that case the transactions are executed sequentially. A rule MUST end with a ".".
{
 ["Test", "Customer"].get(name, departure);
 ["Test", "Destination"].get(name, arrival);
 ["Test", "Connection"].rd(departure, arrival);
 ["Test", "Customer"].put(name, arrival);
 ["Test", "Travel"].put(departure, arrival, name);
}
{
 ["Test", "Customer"].rd(name, departure);
 ["Test", "Destination"].rd(name, arrival);
 ["Test", "Travel"].put(departure, arrival, "failed");
}.

precondition part

The precondition part is made of tokens. The goal is to find a combination of resources matching the masks of all the involved bags.

In principle, tokens may appear in any order: the system could invoke them in a correct order thanks to the dependencies introduced by the propagation of instantiated variables. In practice, this would require some extra processing in the compiler, whereas when you write a rule you naturally write the tokens in the correct order. So we rely on you and assume that you ordered the tokens correctly. If it is not the case, you will get a run-time error ;-)

We promise that automatic ordering will be done one day, but currently this is not our priority, and we are sure that if we had not warned you about it you would not even have noticed ;-)

A token of the precondition part is made of 3 parts:

location of the bag

["Test","Customer"]

Here we want to reach the bag Customer of the object Test. The interesting aspect is that, both for the object name and the bag name, we can use variables instantiated by preceding tokens in the rule (see ... for an example).
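
For instance, assuming a hypothetical bag Dispatch whose resources hold a customer name and the name of another bag (neither Dispatch nor the variable bag_name comes from the tutorial, they are only meant to show the syntax):

{*,!}["Test", "Dispatch"].rd(name, bag_name) &
{*,!}["Test", bag_name].rd(name, arrival) &

Here the second token reaches a bag whose name is only known once the first token has instantiated bag_name.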

modifier for the resource stream management

{*,!}

The first element is the number of expected replies. It can be:

  • n (a positive integer)
  • * for infinite

The second element is the time (in seconds) the system will wait when nothing is available. It can be:

  • t (a positive integer)
  • ! for infinite

From an internal point of view, a closestream() is sent to the stream when the number of replies is reached or the timeout has elapsed.
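
For example, reusing the Customer token of the tutorial and changing only the modifier (an illustrative sketch):

{1,!}["Test", "Customer"].rd(name, departure) &
{*,10}["Test", "Customer"].rd(name, departure) &

The first form expects a single reply and waits for it forever; the second accepts an unlimited number of replies but waits at most 10 seconds when nothing is available.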

the operation read()

.rd

the requested mask

("alice",departure)

The fields of the mask can be:

  • a constant string, noted "alice"
  • a variable which has been instantiated by a preceding token in the rule
  • a variable for which we want a value returned from the bag
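
Putting the three kinds of fields together, here is an illustrative sketch reusing the Travel bag of the tutorial and assuming departure has been instantiated by a preceding token:

{*,!}["Test", "Travel"].rd(departure, arrival, "alice") &

departure is already bound, "alice" is a constant, and arrival will receive a value from each matching resource.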

Extensions for the precondition part

In addition to tokens accessing resources, the LINC assembly language contains several extensions that make it more expressive and your life easier.

ASSERT, COMPUTE

ASSERT: user_lib.assert_function("alice", var) &
COMPUTE: ret_var1, ret_var2 = user_lib.compute_function("alice", var) &

These tokens call the Python functions assert_function and compute_function in the module user_lib. user_lib MUST be in your Python path or in the directory from which you are calling the quinoa.

Assert and compute functions may take any number of parameters; the only rule is that every parameter MUST be a string.

A parameter can be:

  • a constant string, noted "alice"
  • a variable which has been instantiated by a preceding token in the rule

ASSERT

Assert functions MUST return a boolean.

An example of an assert function is:

def are_different(a, b):
    return a != b
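
As a sketch of how such a function could be used in a precondition (reusing the Connection token of the tutorial; are_different is assumed to live in the module user_lib):

{*,!}["Test", "Connection"].rd(departure, arrival) &
ASSERT: user_lib.are_different(departure, arrival) &

The ASSERT token must come after the token that instantiates departure and arrival, since its parameters must already be instantiated.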

COMPUTE

Compute functions MUST return a tuple of strings.

Examples of compute functions are:

def concat(a, b):
    return a + "_" + b
def split(a):
    return a.split("_", 2)

The values returned by a compute function can be used as variables by the following tokens or in the performance part.
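
For instance (an illustrative sketch combining the Customer token of the tutorial with the concat function above; res is just a variable name, as in the inline examples below):

{*,!}["Test", "Customer"].rd(name, departure) &
COMPUTE: res = user_lib.concat(name, departure) &

res can then be used exactly like a variable instantiated by an rd, for example in the mask of a following token or in a put of the performance part.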

INLINE_ASSERT, INLINE_COMPUTE

INLINE_ASSERT: name1 != name2 &
INLINE_COMPUTE: name3 = name1 + name2 &

These tokens allow you to inline any valid Python code.

INLINE_ASSERT

The inline Python code is expected between the : and the & (or the :: if the token is the last one of the precondition).

This inline Python code MUST return a boolean.

INLINE_COMPUTE

The inline Python code is expected between the = and the & (or the :: if the token is the last one of the precondition).

This inline Python code MUST return a tuple of strings.

The values returned by the inline Python code can be used as variables by the following tokens or in the performance part.

Examples of valid tokens:

INLINE_COMPUTE: res = str(int(var) + 2) &
INLINE_COMPUTE: res1, res2 = str(int(var) + 2), "toto" &
INLINE_COMPUTE: res = value" if var == "objet1" else "value2" &

SLEEP

SLEEP: 10 &

When this token is encountered, the coordinator waits the given number of seconds (here 10) before evaluating the next token. (Note that in a previous version SLEEP was called TIMEOUT; if you encounter a TIMEOUT in a rule it has the same meaning as SLEEP, the name SLEEP being simply more meaningful. You MUST now use SLEEP, since support for the TIMEOUT keyword is deprecated.)

WAIT_UNTIL

WAIT_UNTIL: 2011-12-21 06:06:06 &

When this token is encountered, the coordinator waits until the given date (here 21st of December 2011 at 06:06:06) before evaluating the next token. If the date is in the past, the next token is evaluated immediately.

performance part

The performance part is also made of tokens. As already said, the performance part executes each block enclosed in { } with an all-or-nothing semantics. There can be several such blocks, defining several groups of operations that each need this atomicity. There are no dependencies between the blocks: one may succeed while another fails, but inside a block the all-or-nothing property is preserved. Moreover, the operations inside a block are guaranteed to be performed in the order in which they appear in the rule. This is very useful if the considered bags have internal dependencies (see the description of the Visual object for an example).
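
For instance, in the first block of the rule above, the old Customer resource is removed before the updated one is inserted; since the operations of a block are executed in order, the put is guaranteed to happen after the get:

["Test", "Customer"].get(name, departure);
["Test", "Customer"].put(name, arrival);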

A token of the performance part is made of 2 parts:

location of the bag

["Test","Customer"]

Here we want to reach the bag Customer of the object Test. The interesting aspect is that, both for the object name and the bag name, we can use variables instantiated by preceding tokens in the rule.
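
For instance, if bag_name has been instantiated in the precondition part (as in the sketch given for the precondition tokens, where bag_name is only an illustrative name), a performance token can target that bag:

["Test", bag_name].put(name, arrival);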

the operations to perform

.rd
.get
.put

resource to be considered

("alice",departure)

The fields of the mask can be:

  • a constant string, noted "alice"
  • a variable which has been instantiated by a preceding token in the rule