=== Small-step semantics ===

==== Structural operational semantics ====
'''Structural operational semantics''' (SOS, also called '''structured operational semantics''' or '''small-step semantics''') was introduced by [[Gordon Plotkin]] in ([[#plotkin81|Plotkin81]]) as a logical means to define operational semantics. The basic idea behind SOS is to define the behavior of a program in terms of the behavior of its parts, thus providing a structural, i.e., syntax-oriented and [[inductive definition|inductive]], view of operational semantics. An SOS specification defines the behavior of a program in terms of one or more [[State transition system|transition relation]]s. SOS specifications take the form of a set of [[inference rule]]s that define the valid transitions of a composite piece of syntax in terms of the transitions of its components.

As a simple example, consider part of the semantics of a small programming language; fuller illustrations are given in [[#plotkin81|Plotkin81]], [[#hennessybook|Hennessy90]], and other textbooks. Let <math>C_1, C_2</math> range over programs of the language, and let <math>s</math> range over states (e.g. functions from memory locations to values). If we have expressions (ranged over by <math>E</math>), values {{nobreak|(<math>V</math>)}} and locations (<math>L</math>), then a memory update command would have semantics:

<math> \frac{\langle E,s\rangle \Rightarrow V}{\langle L:=E\,,\,s\rangle\longrightarrow (s\uplus (L\mapsto V))} </math>
<!-- NOTE: This is only a fragment of a full semantics. I've tried to include enough to illustrate the points but not so much that it takes a disproportionate amount of space. -->

Informally, the rule says that "'''if''' the expression <math>E</math> in state <math>s</math> reduces to value <math>V</math>, '''then''' the program <math>L:=E</math> will update the state <math>s</math> with the assignment <math>L=V</math>".

The semantics of sequencing can be given by the following three rules:

<math>
\frac{\langle C_1,s\rangle \longrightarrow s'}
     {\langle C_1;C_2 \,,s\rangle\longrightarrow \langle C_2, s'\rangle}
\quad\quad
\frac{\langle C_1,s\rangle \longrightarrow \langle C_1',s'\rangle}
     {\langle C_1;C_2 \,,s\rangle\longrightarrow \langle C_1';C_2\,, s'\rangle}
\quad\quad
\frac{}
     {\langle \mathbf{skip} ,s\rangle\longrightarrow s}
</math>

Informally, the first rule says that, if program <math>C_1</math> in state <math>s</math> finishes in state <math>s'</math>, then the program <math>C_1;C_2</math> in state <math>s</math> will reduce to the program <math>C_2</math> in state <math>s'</math>. (You can think of this as formalizing "You can run <math>C_1</math>, and then run <math>C_2</math> using the resulting memory store.") The second rule says that if the program <math>C_1</math> in state <math>s</math> can reduce to the program <math>C_1'</math> with state <math>s'</math>, then the program <math>C_1;C_2</math> in state <math>s</math> will reduce to the program <math>C_1';C_2</math> in state <math>s'</math>. (You can think of this as formalizing the principle for an optimizing compiler: "You are allowed to transform <math>C_1</math> as if it were stand-alone, even if it is just the first part of a program.") The semantics is structural, because the meaning of the sequential program <math>C_1;C_2</math> is defined by the meaning of <math>C_1</math> and the meaning of <math>C_2</math>.
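Such rules can be read directly as a one-step interpreter. The following Python sketch is purely illustrative: the tuple-based term representation and the names <code>step</code>, <code>eval_expr</code> and <code>run</code> are choices made for this example, not part of the cited formalism, and it also anticipates the '''while''' rules defined in the next paragraph.

<syntaxhighlight lang="python">
# A one-step ("small-step") interpreter for the command fragment above.
# Commands are nested tuples; states are dicts mapping locations to values.

def eval_expr(expr, state):
    """Big-step evaluation <E, s> => V for a tiny expression sublanguage."""
    kind = expr[0]
    if kind == "const":            # ("const", n)
        return expr[1]
    if kind == "loc":              # ("loc", name)
        return state[expr[1]]
    if kind == "add":              # ("add", e1, e2)
        return eval_expr(expr[1], state) + eval_expr(expr[2], state)
    raise ValueError(f"unknown expression: {expr!r}")

def step(cmd, state):
    """One SOS transition <C, s> --> <C', s'>; returns (cmd', state'),
    where cmd' is None when the command has terminated."""
    kind = cmd[0]
    if kind == "skip":             # <skip, s> --> s
        return None, state
    if kind == "assign":           # <L := E, s> --> s[L -> V]  when <E, s> => V
        _, loc, expr = cmd
        return None, {**state, loc: eval_expr(expr, state)}
    if kind == "seq":              # the two sequencing rules
        _, c1, c2 = cmd
        c1_next, state_next = step(c1, state)
        if c1_next is None:        # <C1, s> --> s'  gives  <C1;C2, s> --> <C2, s'>
            return c2, state_next
        return ("seq", c1_next, c2), state_next   # otherwise keep reducing C1
    if kind == "while":            # unfold or terminate, as in the while rules below
        _, cond, body = cmd
        if eval_expr(cond, state):
            return ("seq", body, cmd), state
        return None, state
    raise ValueError(f"unknown command: {cmd!r}")

def run(cmd, state):
    """Drive the transition relation to termination."""
    while cmd is not None:
        cmd, state = step(cmd, state)
    return state

# Example: x := 2; while x do x := x + (-1)
prog = ("seq", ("assign", "x", ("const", 2)),
               ("while", ("loc", "x"),
                         ("assign", "x", ("add", ("loc", "x"), ("const", -1)))))
print(run(prog, {}))   # {'x': 0}
</syntaxhighlight>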
If we also have Boolean expressions over the state, ranged over by <math>B</math>, then we can define the semantics of the '''while''' command:

<math>
\frac{\langle B,s\rangle \Rightarrow \mathbf{true}}{\langle\mathbf{while}\ B\ \mathbf{do}\ C,s\rangle\longrightarrow \langle C;\mathbf{while}\ B\ \mathbf{do}\ C,s\rangle}
\quad
\frac{\langle B,s\rangle \Rightarrow \mathbf{false}}{\langle\mathbf{while}\ B\ \mathbf{do}\ C,s\rangle\longrightarrow s}
</math>

Such a definition allows formal analysis of the behavior of programs, permitting the study of [[Relation (mathematics)|relations]] between programs. Important relations include [[simulation preorder]]s and [[bisimulation]]. These are especially useful in the context of [[Concurrency (computer science)|concurrency theory]].

Thanks to its intuitive look and easy-to-follow structure, SOS has gained great popularity and has become a de facto standard in defining operational semantics. As a sign of success, the original report (the so-called Aarhus report) on SOS ([[#plotkin81|Plotkin81]]) has attracted more than 1000 citations according to CiteSeer [http://citeseer.ist.psu.edu/673965.html], making it one of the most cited technical reports in [[Computer Science]].

==== Reduction semantics ====
'''Reduction semantics''' is an alternative presentation of operational semantics. Its key ideas were first applied to purely functional [[call by name]] and [[call by value]] variants of the [[lambda calculus]] by [[Gordon Plotkin]] in 1975<ref>{{cite journal|last=Plotkin|first=Gordon|date=1975|title=Call-by-name, call-by-value and the λ-calculus|journal=Theoretical Computer Science|volume=1|issue=2|pages=125–159|doi=10.1016/0304-3975(75)90017-1|url=https://www.sciencedirect.com/science/article/pii/0304397575900171/pdf?md5=db2e67c1ada7163a28f124934b483f3a&pid=1-s2.0-0304397575900171-main.pdf|access-date=July 22, 2021|doi-access=free}}</ref> and generalized to higher-order functional languages with imperative features by [[Matthias Felleisen]] in his 1987 dissertation.<ref>{{cite thesis|type=PhD|last=Felleisen|first=Matthias|date=1987|title=The calculi of Lambda-v-CS conversion: a syntactic theory of control and state in imperative higher-order programming languages|publisher=Indiana University|url=https://www2.ccs.neu.edu/racket/pubs/dissertation-felleisen.pdf|access-date=July 22, 2021}}</ref> The method was further elaborated by Matthias Felleisen and Robert Hieb in 1992 into a fully [[equational theory]] for [[control flow|control]] and [[program state|state]].<ref name="felleisen-hieb-92" /> The phrase "reduction semantics" itself was coined by Felleisen and [[Daniel P. Friedman]] in a PARLE 1987 paper.<ref>{{cite conference|last1=Felleisen|first1=Matthias|last2=Friedman|first2=Daniel P.|date=1987|title=A Reduction Semantics for Imperative Higher-Order Languages|book-title=Proceedings of the Parallel Architectures and Languages Europe|volume=1|pages=206–223|conference=International Conference on Parallel Architectures and Languages Europe|publisher=Springer-Verlag|doi=10.1007/3-540-17945-3_12}}</ref>

Reduction semantics are given as a set of ''reduction rules'' that each specify a single potential reduction step.
For example, the following reduction rule states that an assignment statement can be reduced if it sits immediately beside its variable declaration:

<math>\mathbf{let\ rec}\ x = v_1\ \mathbf{in}\ x \leftarrow v_2;\ e\ \ \longrightarrow\ \ \mathbf{let\ rec}\ x = v_2\ \mathbf{in}\ e</math>

To get an assignment statement into such a position, it is "bubbled up" through function applications and the right-hand sides of assignment statements until it reaches the proper point. Since intervening <math>\mathbf{let}</math> expressions may declare distinct variables, the calculus also demands an extrusion rule for <math>\mathbf{let}</math> expressions. Most published uses of reduction semantics define such "bubble rules" with the convenience of evaluation contexts. For example, the grammar of evaluation contexts in a simple [[call by value]] language can be given as

<math> E ::= [\,]\ \big|\ v\ E\ \big|\ E\ e\ \big|\ x \leftarrow E \ \big|\ \mathbf{let\ rec}\ x = v\ \mathbf{in}\ E\ \big|\ E;\ e </math>

where <math>e</math> denotes arbitrary expressions and <math>v</math> denotes fully-reduced values. Each evaluation context includes exactly one hole <math>[\,]</math> into which a term is plugged in a capturing fashion. The shape of the context indicates, via the position of this hole, where a reduction may occur. To describe "bubbling" with the aid of evaluation contexts, a single axiom suffices:

<math>E[\,x \leftarrow v;\ e\,]\ \ \longrightarrow\ \ x \leftarrow v;\ E[\,e\,] \qquad \text{(lift assignments)}</math>

This single reduction rule is the lift rule from Felleisen and Hieb's lambda calculus for assignment statements. The evaluation contexts restrict this rule to certain terms, but it is freely applicable in any term, including under lambdas.

Following Plotkin, showing the usefulness of a calculus derived from a set of reduction rules demands (1) a Church–Rosser lemma for the single-step relation, which induces an evaluation function, and (2) a Curry–Feys standardization lemma for the transitive-reflexive closure of the single-step relation, which replaces the non-deterministic search in the evaluation function with a deterministic leftmost/outermost search. Felleisen showed that imperative extensions of this calculus satisfy these theorems. A consequence of these theorems is that the equational theory (the symmetric-transitive-reflexive closure) is a sound reasoning principle for these languages. However, in practice, most applications of reduction semantics dispense with the calculus and use the standard reduction only (and the evaluator that can be derived from it).

Reduction semantics are particularly useful given the ease with which evaluation contexts can model state or unusual control constructs (e.g., [[first-class continuations]]). In addition, reduction semantics have been used to model [[object-oriented]] languages,<ref>{{cite book|title=A Theory of Objects|last1=Abadi|first1=M.|last2=Cardelli|first2=L.|date=8 September 2012|publisher=Springer |isbn=9781441985989|url=https://books.google.com/books?id=AgzSBwAAQBAJ&q=%22operational+semantics%22}}</ref> [[design by contract|contract systems]], exceptions, futures, call-by-need, and many other language features.
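The standard-reduction evaluator mentioned above works by repeatedly decomposing a program into an evaluation context and a redex, contracting the redex, and plugging the result back into the hole. The following Python sketch illustrates that decompose/contract/plug cycle for only the application contexts <math>[\,] \mid v\ E \mid E\ e</math> of the grammar above; the assignment and <math>\mathbf{let\ rec}</math> forms, the lift rule, and capture-avoiding substitution are omitted, and the term representation and the names <code>decompose</code>, <code>substitute</code> and <code>step</code> are choices made for this illustration, not part of the cited calculi.

<syntaxhighlight lang="python">
# Standard reduction via evaluation contexts for a call-by-value core.
# Terms are tuples: ("var", x), ("lam", x, body), ("app", fun, arg).

def is_value(t):
    return t[0] in ("var", "lam")

def decompose(t):
    """Split t into (plug, redex): plug fills the hole of the evaluation
    context, redex is the next reducible application.  Returns None for
    values (nothing left to reduce)."""
    if is_value(t):
        return None
    _, fun, arg = t                        # t = ("app", fun, arg)
    if is_value(fun) and is_value(arg):    # the hole is here: E = []
        return (lambda r: r), t
    if is_value(fun):                      # context  v E
        plug, redex = decompose(arg)
        return (lambda r: ("app", fun, plug(r))), redex
    plug, redex = decompose(fun)           # context  E e
    return (lambda r: ("app", plug(r), arg)), redex

def substitute(body, x, v):
    """Naive substitution; adequate for closed examples without shadowing."""
    kind = body[0]
    if kind == "var":
        return v if body[1] == x else body
    if kind == "lam":
        return body if body[1] == x else ("lam", body[1], substitute(body[2], x, v))
    return ("app", substitute(body[1], x, v), substitute(body[2], x, v))

def step(t):
    """One standard-reduction step: decompose, contract the beta-v redex,
    plug the result back into the context.  Returns None for values."""
    d = decompose(t)
    if d is None:
        return None
    plug, redex = d
    _, (_, x, body), v = redex             # redex = ("app", ("lam", x, body), v)
    return plug(substitute(body, x, v))

# ((lambda x. x) (lambda y. y)) reduces in one step to (lambda y. y)
identity = ("lam", "y", ("var", "y"))
print(step(("app", ("lam", "x", ("var", "x")), identity)))
</syntaxhighlight>

Iterating <code>step</code> until it returns <code>None</code> gives the deterministic leftmost/outermost evaluator that, per the text above, most applications of reduction semantics use in place of the full calculus.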
A thorough, modern treatment of reduction semantics that discusses several such applications at length is given by Matthias Felleisen, Robert Bruce Findler and Matthew Flatt in ''Semantics Engineering with PLT Redex''.<ref>{{cite book|last1=Felleisen|first1=Matthias|last2=Findler|first2=Robert Bruce|last3=Flatt|first3=Matthew|title=Semantics Engineering with PLT Redex|year=2009|publisher=The MIT Press|isbn=978-0-262-06275-6|url=https://mitpress.mit.edu/9780262062756/semantics-engineering-with-plt-redex/}}</ref>