Syntax error async with outside async function




Closed · adnan2911 opened this issue on Jan 1, 2020 · 3 comments

@adnan2911

Hello
I am using very simple code on a Raspberry Pi with Python 3, as below:

import aiohttp
import pysmartthings

token = 'xxxxxxx'

async with aiohttp.ClientSession() as session:
    api = pysmartthings.SmartThings(session, token)
    devices = await api.devices()
    print(len(devices))

but I am getting the error below:
SyntaxError: 'async with' outside async function

Kindly please help
Thanks
Adnan

@posborne

Hi @adnan2911, you are correct that the snippet is not complete as provided. With Python asyncio you will need to put that fragment inside an async function. You will also need to run the async function on an event loop. Here’s a complete example you can play with:

#!/usr/bin/env python3
import aiohttp
import asyncio
import pysmartthings

token = '...'

async def print_devices():
    async with aiohttp.ClientSession() as session:
        api = pysmartthings.SmartThings(session, token)
        devices = await api.devices()
        for device in devices:
            print("{}: {}".format(device.device_id, device.label))


def main():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(print_devices())
    loop.close()

if __name__ == '__main__':
    main()
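
As a side note (not part of the original reply): on Python 3.7 and later, asyncio.run() can replace the manual event loop handling in main(). A minimal sketch, assuming the same print_devices coroutine defined in the example above:

import asyncio

if __name__ == '__main__':
    # asyncio.run() creates the event loop, runs the coroutine to
    # completion, and closes the loop afterwards.
    asyncio.run(print_devices())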

@adnan2911

Thanks a lot Paul, it worked with the above code.

Regards,
Adnan

@FelixAsch

How can I change the mediaInputSource on my SmartTV?
I'm new to programming with SmartThings, so I don't understand your docs.

Regards,
Felix

PEP 492 – Coroutines with async and await syntax

Author:
Yury Selivanov <yury at edgedb.com>
Discussions-To:
Python-Dev list
Status:
Final
Type:
Standards Track
Created:
09-Apr-2015
Python-Version:
3.5
Post-History:
17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015, 05-May-2015

Table of Contents

  • Abstract
  • API Design and Implementation Revisions
  • Rationale and Goals
  • Specification
    • New Coroutine Declaration Syntax
    • types.coroutine()
    • Await Expression
      • Updated operator precedence table
      • Examples of “await” expressions
    • Asynchronous Context Managers and “async with”
      • New Syntax
      • Example
    • Asynchronous Iterators and “async for”
      • New Syntax
      • Example 1
      • Example 2
      • Why StopAsyncIteration?
    • Coroutine objects
      • Differences from generators
      • Coroutine object methods
    • Debugging Features
    • New Standard Library Functions
    • New Abstract Base Classes
  • Glossary
  • Transition Plan
    • Backwards Compatibility
      • asyncio
      • asyncio migration strategy
      • async/await in CPython code base
    • Grammar Updates
    • Deprecation Plans
  • Design Considerations
    • PEP 3152
    • Coroutine-generators
    • Why “async” and “await” keywords
    • Why “__aiter__” does not return an awaitable
    • Importance of “async” keyword
    • Why “async def”
    • Why not “await for” and “await with”
    • Why “async def” and not “def async”
    • Why not a __future__ import
    • Why magic methods start with “a”
    • Why not reuse existing magic names
    • Why not reuse existing “for” and “with” statements
    • Comprehensions
    • Async lambda functions
  • Performance
    • Overall Impact
    • Tokenizer modifications
    • async/await
  • Reference Implementation
    • List of high-level changes and new protocols
    • Working example
  • Acceptance
  • Implementation
  • References
  • Acknowledgments
  • Copyright

Abstract

The growth of Internet and general connectivity has triggered the
proportionate need for responsive and scalable code. This proposal
aims to answer that need by making writing explicitly asynchronous,
concurrent Python code easier and more Pythonic.

It is proposed to make coroutines a proper standalone concept in
Python, and introduce new supporting syntax. The ultimate goal
is to help establish a common, easily approachable, mental
model of asynchronous programming in Python and make it as close to
synchronous programming as possible.

This PEP assumes that the asynchronous tasks are scheduled and
coordinated by an Event Loop similar to that of stdlib module
asyncio.events.AbstractEventLoop. While the PEP is not tied to any
specific Event Loop implementation, it is relevant only to the kind of
coroutine that uses yield as a signal to the scheduler, indicating
that the coroutine will be waiting until an event (such as IO) is
completed.

We believe that the changes proposed here will help keep Python
relevant and competitive in a quickly growing area of asynchronous
programming, as many other languages have adopted, or are planning to
adopt, similar features: [2], [5], [6], [7], [8], [10].

API Design and Implementation Revisions

  1. Feedback on the initial beta release of Python 3.5 resulted in a
    redesign of the object model supporting this PEP to more clearly
    separate native coroutines from generators — rather than being a
    new kind of generator, native coroutines are now their own
    completely distinct type (implemented in [17]).

    This change was implemented based primarily on problems
    encountered attempting to integrate support for native coroutines
    into the Tornado web server (reported in [18]).

  2. In CPython 3.5.2, the __aiter__ protocol was updated.

    Before 3.5.2, __aiter__ was expected to return an awaitable
    resolving to an asynchronous iterator. Starting with 3.5.2,
    __aiter__ should return asynchronous iterators directly.

    If the old protocol is used in 3.5.2, Python will raise a
    PendingDeprecationWarning.

    In CPython 3.6, the old __aiter__ protocol will still be
    supported with a DeprecationWarning being raised.

    In CPython 3.7, the old __aiter__ protocol will no longer be
    supported: a RuntimeError will be raised if __aiter__
    returns anything but an asynchronous iterator.

    See [19] and [20] for more details.

Rationale and Goals

Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the yield from syntax introduced in PEP
380. This approach has a number of shortcomings:

  • It is easy to confuse coroutines with regular generators, since they
    share the same syntax; this is especially true for new developers.
  • Whether or not a function is a coroutine is determined by a presence
    of yield or yield from statements in its body, which can
    lead to unobvious errors when such statements appear in or disappear
    from function body during refactoring.
  • Support for asynchronous calls is limited to expressions where
    yield is allowed syntactically, limiting the usefulness of
    syntactic features, such as with and for statements.

This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new async with
statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new async for
statement makes it possible to perform asynchronous calls in iterators.

Specification

This proposal introduces new syntax and semantics to enhance coroutine
support in Python.

This specification presumes knowledge of the implementation of
coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax
changes proposed here comes from the asyncio framework (PEP 3156) and
the “Cofunctions” proposal (PEP 3152, now rejected in favor of this
specification).

From this point in this document we use the word native coroutine to
refer to functions declared using the new syntax. generator-based
coroutine is used where necessary to refer to coroutines that are
based on generator syntax. coroutine is used in contexts where both
definitions are applicable.

New Coroutine Declaration Syntax

The following new syntax is used to declare a native coroutine:

async def read_data(db):
    pass

Key properties of coroutines:

  • async def functions are always coroutines, even if they do not
    contain await expressions.
  • It is a SyntaxError to have yield or yield from
    expressions in an async function.
  • Internally, two new code object flags were introduced:
    • CO_COROUTINE is used to mark native coroutines
      (defined with new syntax).
    • CO_ITERABLE_COROUTINE is used to make generator-based coroutines
      compatible with native coroutines (set by types.coroutine() function).
  • Regular generators, when called, return a generator object;
    similarly, coroutines return a coroutine object.
  • StopIteration exceptions are not propagated out of coroutines,
    and are replaced with a RuntimeError. For regular generators
    such behavior requires a future import (see PEP 479).
  • When a native coroutine is garbage collected, a RuntimeWarning
    is raised if it was never awaited on (see also
    Debugging Features).
  • See also Coroutine objects section.
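
A minimal sketch (not part of the PEP text) illustrating a few of these properties:

async def read_data(db):      # a native coroutine, even with no await inside
    return 42

coro = read_data(None)        # calling it only creates a coroutine object
print(type(coro).__name__)    # -> 'coroutine'

# Driving it by hand: the return value travels in the StopIteration
# raised by send(); awaiting it from another coroutine hides this detail.
try:
    coro.send(None)
except StopIteration as exc:
    print(exc.value)          # -> 42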

types.coroutine()

A new function coroutine(fn) is added to the types module. It
allows interoperability between existing generator-based coroutines
in asyncio and native coroutines introduced by this PEP:

@types.coroutine
def process_data(db):
    data = yield from read_data(db)
    ...

The function applies CO_ITERABLE_COROUTINE flag to the generator
function’s code object, making it return a coroutine object.

If fn is not a generator function, it is wrapped. If it returns
a generator, it will be wrapped in an awaitable proxy object
(see below the definition of awaitable objects).

Note that the CO_COROUTINE flag is not applied by
types.coroutine() to make it possible to separate native coroutines
defined with new syntax from generator-based coroutines.
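
A hedged usage sketch: the decorated generator below becomes awaitable from a native coroutine, which is exactly the interoperability this flag provides (the names sleep_zero and work are illustrative only):

import types

@types.coroutine
def sleep_zero():
    # generator-based coroutine: yields once to the scheduler
    yield

async def work():
    # a native coroutine may await the decorated generator directly
    await sleep_zero()
    return 'done'

coro = work()
try:
    coro.send(None)   # runs until the bare yield inside sleep_zero()
    coro.send(None)   # resumes and finishes the coroutine
except StopIteration as exc:
    print(exc.value)  # -> 'done'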

Await Expression

The following new await expression is used to obtain a result of
coroutine execution:

async def read_data(db):
    data = await db.fetch('SELECT ...')
    ...

await, similarly to yield from, suspends execution of
read_data coroutine until db.fetch awaitable completes and
returns the result data.

It uses the yield from implementation with an extra step of
validating its argument. await only accepts an awaitable, which
can be one of:

  • A native coroutine object returned from a native coroutine function.
  • A generator-based coroutine object returned from a function
    decorated with types.coroutine().
  • An object with an __await__ method returning an iterator.

    Any yield from chain of calls ends with a yield. This is a
    fundamental mechanism of how Futures are implemented. Since,
    internally, coroutines are a special kind of generators, every
    await is suspended by a yield somewhere down the chain of
    await calls (please refer to PEP 3156 for a detailed
    explanation).

    To enable this behavior for coroutines, a new magic method called
    __await__ is added. In asyncio, for instance, to enable Future
    objects in await statements, the only change is to add
    __await__ = __iter__ line to asyncio.Future class.

    Objects with __await__ method are called Future-like objects in
    the rest of this PEP.

    It is a TypeError if __await__ returns anything but an
    iterator.

  • Objects defined with CPython C API with a tp_as_async.am_await
    function, returning an iterator (similar to __await__ method).

It is a SyntaxError to use await outside of an async def
function (like it is a SyntaxError to use yield outside of
def function).

It is a TypeError to pass anything other than an awaitable object
to an await expression.
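
As an illustration (a sketch, not asyncio's actual implementation), here is a toy Future-like object whose __await__ returns an iterator, making it a valid operand for await:

class SimpleFuture:
    """Toy Future-like object; real code should use asyncio.Future."""
    def __init__(self):
        self.result = None

    def __await__(self):
        yield self            # suspend once, handing control to the driver
        return self.result    # value produced for the awaiting coroutine

async def consumer(fut):
    value = await fut         # valid: fut.__await__() returns an iterator
    print('got', value)

fut = SimpleFuture()
coro = consumer(fut)
coro.send(None)               # runs up to the yield inside __await__
fut.result = 'spam'
try:
    coro.send(None)           # resumes; the await evaluates to fut.result
except StopIteration:
    pass                      # consumer finished; it printed 'got spam'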

Updated operator precedence table

await keyword is defined as follows:

power ::=  await ["**" u_expr]
await ::=  ["await"] primary

where “primary” represents the most tightly bound operations of the
language. Its syntax is:

primary ::=  atom | attributeref | subscription | slicing | call

See Python Documentation [12] and Grammar Updates section of this
proposal for details.

The key await difference from yield and yield from
operators is that await expressions do not require parentheses around
them most of the time.

Also, yield from allows any expression as its argument, including
expressions like yield from a() + b(), that would be parsed as
yield from (a() + b()), which is almost always a bug. In general,
the result of any arithmetic operation is not an awaitable object.
To avoid this kind of mistake, it was decided to make await
precedence lower than [], (), and ., but higher than **
operators.

Operator                          Description
--------------------------------  --------------------------------------
yield x, yield from x             Yield expression
lambda                            Lambda expression
if – else                         Conditional expression
or                                Boolean OR
and                               Boolean AND
not x                             Boolean NOT
in, not in, is, is not,           Comparisons, including membership
<, <=, >, >=, !=, ==              tests and identity tests
|                                 Bitwise OR
^                                 Bitwise XOR
&                                 Bitwise AND
<<, >>                            Shifts
+, -                              Addition and subtraction
*, @, /, //, %                    Multiplication, matrix
                                  multiplication, division, remainder
+x, -x, ~x                        Positive, negative, bitwise NOT
**                                Exponentiation
await x                           Await expression
x[index], x[index:index],         Subscription, slicing,
x(arguments...), x.attribute      call, attribute reference
(expressions...),                 Binding or tuple display,
[expressions...],                 list display,
{key: value...},                  dictionary display,
{expressions...}                  set display

Examples of “await” expressions

Valid syntax examples:

Expression                        Will be parsed as
if await fut: pass                if (await fut): pass
if await fut + 1: pass            if (await fut) + 1: pass
pair = await fut, 'spam'          pair = (await fut), 'spam'
with await fut, open(): pass      with (await fut), open(): pass
await foo()['spam'].baz()()       await ( foo()['spam'].baz()() )
return await coro()               return ( await coro() )
res = await coro() ** 2           res = (await coro()) ** 2
func(a1=await coro(), a2=0)       func(a1=(await coro()), a2=0)
await foo() + await bar()         (await foo()) + (await bar())
-await foo()                      -(await foo())

Invalid syntax examples:

Expression                        Should be written as
await await coro()                await (await coro())
await -coro()                     await (-coro())

Asynchronous Context Managers and “async with”

An asynchronous context manager is a context manager that is able to
suspend execution in its enter and exit methods.

To make this possible, a new protocol for asynchronous context managers
is proposed. Two new magic methods are added: __aenter__ and
__aexit__. Both must return an awaitable.

An example of an asynchronous context manager:

class AsyncContextManager:
    async def __aenter__(self):
        await log('entering context')

    async def __aexit__(self, exc_type, exc, tb):
        await log('exiting context')

New Syntax

A new statement for asynchronous context managers is proposed:

async with EXPR as VAR:
    BLOCK

which is semantically equivalent to:

mgr = (EXPR)
aexit = type(mgr).__aexit__
aenter = type(mgr).__aenter__

VAR = await aenter(mgr)
try:
    BLOCK
except:
    if not await aexit(mgr, *sys.exc_info()):
        raise
else:
    await aexit(mgr, None, None, None)

As with regular with statements, it is possible to specify multiple
context managers in a single async with statement.

It is an error to pass a regular context manager without __aenter__
and __aexit__ methods to async with. It is a SyntaxError
to use async with outside of an async def function.

Example

With asynchronous context managers it is easy to implement proper
database transaction managers for coroutines:

async def commit(session, data):
    ...

    async with session.transaction():
        ...
        await session.update(data)
        ...

Code that needs locking also looks lighter:

async with lock:
    ...

instead of:

with (yield from lock):
    ...
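
A runnable sketch of the lock example, assuming asyncio's standard Lock (which implements __aenter__/__aexit__); the loop handling follows the style of the examples in this PEP, while newer Python versions would typically use asyncio.run():

import asyncio

lock = asyncio.Lock()

async def guarded(name):
    async with lock:             # __aenter__/__aexit__ are awaited implicitly
        print(name, 'holds the lock')
        await asyncio.sleep(0.1)

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.gather(guarded('a'), guarded('b')))
loop.close()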

Asynchronous Iterators and “async for”

An asynchronous iterable is able to call asynchronous code in its
iter implementation, and an asynchronous iterator can call
asynchronous code in its next method. To support asynchronous
iteration:

  1. An object must implement an __aiter__ method (or, if defined
    with CPython C API, tp_as_async.am_aiter slot) returning an
    asynchronous iterator object.
  2. An asynchronous iterator object must implement an __anext__
    method (or, if defined with CPython C API, tp_as_async.am_anext
    slot) returning an awaitable.
  3. To stop iteration __anext__ must raise a StopAsyncIteration
    exception.

An example of asynchronous iterable:

class AsyncIterable:
    def __aiter__(self):
        return self

    async def __anext__(self):
        data = await self.fetch_data()
        if data:
            return data
        else:
            raise StopAsyncIteration

    async def fetch_data(self):
        ...

New Syntax

A new statement for iterating through asynchronous iterators is
proposed:

async for TARGET in ITER:
    BLOCK
else:
    BLOCK2

which is semantically equivalent to:

iter = (ITER)
iter = type(iter).__aiter__(iter)
running = True
while running:
    try:
        TARGET = await type(iter).__anext__(iter)
    except StopAsyncIteration:
        running = False
    else:
        BLOCK
else:
    BLOCK2

It is a TypeError to pass a regular iterable without __aiter__
method to async for. It is a SyntaxError to use async for
outside of an async def function.

As with the regular for statement, async for has an optional
else clause.

Example 1

With asynchronous iteration protocol it is possible to asynchronously
buffer data during iteration:

async for data in cursor:
    ...

Where cursor is an asynchronous iterator that prefetches N rows
of data from a database after every N iterations.

The following code illustrates new asynchronous iteration protocol:

class Cursor:
    def __init__(self):
        self.buffer = collections.deque()

    async def _prefetch(self):
        ...

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self.buffer:
            self.buffer = await self._prefetch()
            if not self.buffer:
                raise StopAsyncIteration
        return self.buffer.popleft()

then the Cursor class can be used as follows:

async for row in Cursor():
    print(row)

which would be equivalent to the following code:

i = Cursor().__aiter__()
while True:
    try:
        row = await i.__anext__()
    except StopAsyncIteration:
        break
    else:
        print(row)

Example 2

The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.

class AsyncIteratorWrapper:
    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            value = next(self._it)
        except StopIteration:
            raise StopAsyncIteration
        return value

async for letter in AsyncIteratorWrapper("abc"):
    print(letter)

Why StopAsyncIteration?

Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between

def g1():
    yield from fut
    return 'spam'

and

def g2():
    yield from fut
    raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its StopIteration wrapped into a
RuntimeError

async def a1():
    await fut
    raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is
to raise something other than StopIteration. Therefore, a new
built-in exception class StopAsyncIteration was added.

Moreover, with semantics from PEP 479, all StopIteration exceptions
raised in coroutines are wrapped in RuntimeError.

Coroutine objects

Differences from generators

This section applies only to native coroutines with CO_COROUTINE
flag, i.e. defined with the new async def syntax.

The behavior of existing *generator-based coroutines* in asyncio
remains unchanged.

Great effort has been made to make sure that coroutines and
generators are treated as distinct concepts:

  1. Native coroutine objects do not implement __iter__ and
    __next__ methods. Therefore, they cannot be iterated over or
    passed to iter(), list(), tuple() and other built-ins.
    They also cannot be used in a for..in loop.

    An attempt to use __iter__ or __next__ on a native coroutine
    object will result in a TypeError.

  2. Plain generators cannot yield from native coroutines:
    doing so will result in a TypeError.
  3. generator-based coroutines (for asyncio code must be decorated
    with @asyncio.coroutine) can yield from native coroutine objects.
  4. inspect.isgenerator() and inspect.isgeneratorfunction()
    return False for native coroutine objects and native coroutine
    functions.
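
A short sketch (not from the PEP) showing points 1 and 2 in practice:

async def coro():
    return 1

c1 = coro()
# 1. Native coroutine objects are not iterable.
try:
    iter(c1)
except TypeError as exc:
    print('iter():', exc)
c1.close()                  # avoid the "never awaited" RuntimeWarning

# 2. A plain generator cannot 'yield from' a native coroutine object.
c2 = coro()
def gen():
    yield from c2
try:
    next(gen())
except TypeError as exc:
    print('yield from:', exc)
c2.close()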

Coroutine object methods

Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutines have
throw(), send() and close() methods. StopIteration and
GeneratorExit play the same role for coroutines (although
PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380,
and Python Documentation [11] for details.

throw(), send() methods for coroutines are used to push
values and raise errors into Future-like objects.
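
A brief illustrative sketch of throw() and close() on a suspended coroutine (the Suspend helper is hypothetical, standing in for a Future-like object):

class Suspend:
    def __await__(self):
        yield                       # suspend; yields control to the driver

async def waiter():
    try:
        await Suspend()
    except ValueError as exc:
        return 'caught: {}'.format(exc)

coro = waiter()
coro.send(None)                     # run up to the suspension point
try:
    coro.throw(ValueError('boom'))  # raise inside the coroutine, at the await
except StopIteration as exc:
    print(exc.value)                # -> 'caught: boom'

coro2 = waiter()
coro2.send(None)
coro2.close()                       # injects GeneratorExit and finalizes it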

Debugging Features

A common beginner mistake is forgetting to use yield from on
coroutines:

@asyncio.coroutine
def useful():
    asyncio.sleep(1) # this will do nothing without 'yield from'

For debugging these kinds of mistakes there is a special debug mode in
asyncio, in which @coroutine decorator wraps all functions with a
special object with a destructor logging a warning. Whenever a wrapped
generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorator function
was defined, stack trace of where it was collected, etc. Wrapper
object also provides a convenient __repr__ function with detailed
information about the generator.

The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, @coroutine
decorator makes the decision of whether to wrap or not to wrap based on
an OS environment variable PYTHONASYNCIODEBUG. This way it is
possible to run asyncio programs with asyncio’s own functions
instrumented. EventLoop.set_debug, a different debug facility, has
no impact on @coroutine decorator’s behavior.

With this proposal, coroutines become a native concept, distinct from
generators. In addition to a RuntimeWarning being raised on
coroutines that were never awaited, it is proposed to add two new
functions to the sys module: set_coroutine_wrapper and
get_coroutine_wrapper. This is to enable advanced debugging
facilities in asyncio and other frameworks (such as displaying where
exactly coroutine was created, and a more detailed stack trace of where
it was garbage collected).
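
A hedged sketch of the proposed hook (sys.set_coroutine_wrapper shipped in CPython 3.5, was deprecated in 3.7 and removed in 3.8, so this only runs on those older versions):

import sys
import traceback

def creation_logger(coro):
    # Called for every native coroutine object created in this thread.
    print('created {!r} at:'.format(coro))
    traceback.print_stack(limit=3)
    return coro

sys.set_coroutine_wrapper(creation_logger)   # CPython 3.5-3.7 only
try:
    async def noop():
        pass
    c = noop()        # triggers creation_logger
    c.close()
finally:
    sys.set_coroutine_wrapper(None)           # reset the wrapper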

New Standard Library Functions

  • types.coroutine(gen). See types.coroutine() section for
    details.
  • inspect.iscoroutine(obj) returns True if obj is a
    native coroutine object.
  • inspect.iscoroutinefunction(obj) returns True if obj is a
    native coroutine function.
  • inspect.isawaitable(obj) returns True if obj is an
    awaitable.
  • inspect.getcoroutinestate(coro) returns the current state of
    a native coroutine object (mirrors
    inspect.getgeneratorstate(gen)).
  • inspect.getcoroutinelocals(coro) returns the mapping of a
    native coroutine object’s local variables to their values
    (mirrors inspect.getgeneratorlocals(gen)).
  • sys.set_coroutine_wrapper(wrapper) allows intercepting the creation of
    native coroutine objects. wrapper must be either a callable that
    accepts one argument (a coroutine object), or None. None
    resets the wrapper. If called twice, the new wrapper replaces the
    previous one. The function is thread-specific. See Debugging
    Features for more details.
  • sys.get_coroutine_wrapper() returns the current wrapper object.
    Returns None if no wrapper was set. The function is
    thread-specific. See Debugging Features for more details.
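
A minimal sketch exercising a few of the new introspection helpers:

import inspect

async def fetch():
    return 'data'

print(inspect.iscoroutinefunction(fetch))   # True
coro = fetch()
print(inspect.iscoroutine(coro))            # True
print(inspect.isawaitable(coro))            # True
print(inspect.getcoroutinestate(coro))      # 'CORO_CREATED'
coro.close()
print(inspect.getcoroutinestate(coro))      # 'CORO_CLOSED'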

New Abstract Base Classes

In order to allow better integration with existing frameworks (such as
Tornado, see [13]) and compilers (such as Cython, see [16]), two new
Abstract Base Classes (ABC) are added:

  • collections.abc.Awaitable ABC for Future-like classes, that
    implement __await__ method.
  • collections.abc.Coroutine ABC for coroutine objects, that
    implement send(value), throw(type, exc, tb), close() and
    __await__() methods.

    Note that generator-based coroutines with CO_ITERABLE_COROUTINE
    flag do not implement __await__ method, and therefore are not
    instances of collections.abc.Coroutine and
    collections.abc.Awaitable ABCs:

    @types.coroutine
    def gencoro():
        yield
    
    assert not isinstance(gencoro(), collections.abc.Coroutine)
    
    # however:
    assert inspect.isawaitable(gencoro())
    

To allow easy testing if objects support asynchronous iteration, two
more ABCs are added:

  • collections.abc.AsyncIterable – tests for __aiter__ method.
  • collections.abc.AsyncIterator – tests for __aiter__ and
    __anext__ methods.
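
A short sketch of structural checks against these ABCs (both recognize conforming classes via __subclasshook__, so no explicit registration is needed):

import collections.abc

class Ticker:
    def __aiter__(self):
        return self

    async def __anext__(self):
        raise StopAsyncIteration

t = Ticker()
print(isinstance(t, collections.abc.AsyncIterable))          # True
print(isinstance(t, collections.abc.AsyncIterator))          # True
print(isinstance(iter([]), collections.abc.AsyncIterator))   # False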

Glossary

Native coroutine function
A coroutine function is declared with async def. It uses
await and return value; see New Coroutine Declaration
Syntax for details.
Native coroutine
Returned from a native coroutine function. See Await Expression
for details.
Generator-based coroutine function
Coroutines based on generator syntax. The most common example is a
function decorated with @asyncio.coroutine.
Generator-based coroutine
Returned from a generator-based coroutine function.
Coroutine
Either native coroutine or generator-based coroutine.
Coroutine object
Either native coroutine object or generator-based coroutine
object.
Future-like object
An object with an __await__ method, or a C object with
tp_as_async->am_await function, returning an iterator. Can be
consumed by an await expression in a coroutine. A coroutine
waiting for a Future-like object is suspended until the Future-like
object’s __await__ completes, and returns the result. See
Await Expression for details.
Awaitable
A Future-like object or a coroutine object. See Await
Expression for details.
Asynchronous context manager
An asynchronous context manager has __aenter__ and __aexit__
methods and can be used with async with. See Asynchronous
Context Managers and “async with” for details.
Asynchronous iterable
An object with an __aiter__ method, which must return an
asynchronous iterator object. Can be used with async for.
See Asynchronous Iterators and “async for” for details.
Asynchronous iterator
An asynchronous iterator has an __anext__ method. See
Asynchronous Iterators and “async for” for details.

Transition Plan

To avoid backwards compatibility issues with the async and await
keywords, it was decided to modify tokenizer.c in such a way that
it:

  • recognizes async def NAME tokens combination;
  • while tokenizing async def block, it replaces 'async'
    NAME token with ASYNC, and 'await' NAME token with
    AWAIT;
  • while tokenizing def block, it yields 'async' and 'await'
    NAME tokens as is.

This approach allows for seamless combination of new syntax features
(all of them available only in async functions) with any existing
code.

An example of having “async def” and “async” attribute in one piece of
code:

class Spam:
    async = 42

async def ham():
    print(getattr(Spam, 'async'))

# The coroutine can be executed and will print '42'

Backwards Compatibility

This proposal preserves 100% backwards compatibility.

asyncio

asyncio module was adapted and tested to work with coroutines and
new statements. Backwards compatibility is 100% preserved, i.e. all
existing code will work as-is.

The required changes are mainly:

  1. Modify @asyncio.coroutine decorator to use new
    types.coroutine() function.
  2. Add __await__ = __iter__ line to asyncio.Future class.
  3. Add ensure_future() as an alias for async() function.
    Deprecate async() function.

asyncio migration strategy

Because plain generators cannot yield from native coroutine
objects
(see Differences from generators section for more details),
it is advised to make sure that all generator-based coroutines are
decorated with @asyncio.coroutine before starting to use the new
syntax.

async/await in CPython code base

There is no use of await names in CPython.

async is mostly used by asyncio. We are addressing this by
renaming async() function to ensure_future() (see asyncio
section for details).

Another use of async keyword is in Lib/xml/dom/xmlbuilder.py,
to define an async = False attribute for DocumentLS class.
There is no documentation or tests for it, it is not used anywhere else
in CPython. It is replaced with a getter, that raises a
DeprecationWarning, advising to use async_ attribute instead.
‘async’ attribute is not documented and is not used in CPython code
base.

Grammar Updates

Grammar changes are fairly minimal:

decorated: decorators (classdef | funcdef | async_funcdef)
async_funcdef: ASYNC funcdef

compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                | funcdef | classdef | decorated | async_stmt)

async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

power: atom_expr ['**' factor]
atom_expr: [AWAIT] atom trailer*

Deprecation Plans

async and await names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
async and await proper keywords before 3.7 might make it harder
for people to port their code to Python 3.

Design Considerations

PEP 3152

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called “cofunctions”). Some key points:

  1. A new keyword codef to declare a cofunction. Cofunction is
    always a generator, even if there is no cocall expressions
    inside it. Maps to async def in this proposal.
  2. A new keyword cocall to call a cofunction. Can only be used
    inside a cofunction. Maps to await in this proposal (with
    some differences, see below).
  3. It is not possible to call a cofunction without a cocall
    keyword.
  4. cocall grammatically requires parentheses after it:
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
    
  5. cocall f(*args, **kwds) is semantically equivalent to
    yield from f.__cocall__(*args, **kwds).

Differences from this proposal:

  1. There is no equivalent of __cocall__ in this PEP, which is
    called and its result is passed to yield from in the cocall
    expression. await keyword expects an awaitable object,
    validates the type, and executes yield from on it. Although,
    __await__ method is similar to __cocall__, but is only used
    to define Future-like objects.
  2. await is defined in almost the same way as yield from in the
    grammar (it is later enforced that await can only be inside
    async def). It is possible to simply write await future,
    whereas cocall always requires parentheses.
  3. To make asyncio work with PEP 3152 it would be required to modify
    @asyncio.coroutine decorator to wrap all functions in an object
    with a __cocall__ method, or to implement __cocall__ on
    generators. To call cofunctions from existing generator-based
    coroutines it would be required to use costart(cofunc, *args,
    **kwargs)
    built-in.
  4. Since it is impossible to call a cofunction without a cocall
    keyword, it automatically prevents the common mistake of forgetting
    to use yield from on generator-based coroutines. This proposal
    addresses this problem with a different approach, see Debugging
    Features.
  5. A shortcoming of requiring a cocall keyword to call a coroutine
    is that if it is decided to implement coroutine-generators –
    coroutines with yield or async yield expressions – we
    wouldn’t need a cocall keyword to call them. So we’ll end up
    having __cocall__ and no __call__ for regular coroutines,
    and having __call__ and no __cocall__ for
    coroutine-generators.
  6. Requiring parentheses grammatically also introduces a whole lot
    of new problems.

    The following code:

    await fut
    await function_returning_future()
    await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))
    

    would look like:

    cocall fut()  # or cocall costart(fut)
    cocall (function_returning_future())()
    cocall asyncio.gather(costart(coro1, arg1, arg2),
                          costart(coro2, arg1, arg2))
    
  7. There are no equivalents of async for and async with in PEP
    3152.

Coroutine-generators

With async for keyword it is desirable to have a concept of a
coroutine-generator – a coroutine with yield and yield from
expressions. To avoid any ambiguity with regular generators, we would
likely require to have an async keyword before yield, and
async yield from would raise a StopAsyncIteration exception.

While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal. It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects. This is a matter
for a separate PEP.

Why “async” and “await” keywords

async/await is not a new concept in programming languages:

  • C# has had it for a long time [5];
  • proposal to add async/await in ECMAScript 7 [2];
    see also Traceur project [9];
  • Facebook’s Hack/HHVM [6];
  • Google’s Dart language [7];
  • Scala [8];
  • proposal to add async/await to C++ [10];
  • and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).

Why “__aiter__” does not return an awaitable

PEP 492 was accepted in CPython 3.5.0 with __aiter__ defined as
a method, that was expected to return an awaitable resolving to an
asynchronous iterator.

In 3.5.2 (as PEP 492 was accepted on a provisional basis) the
__aiter__ protocol was updated to return asynchronous iterators
directly.

The motivation behind this change is to make it possible to
implement asynchronous generators in Python. See [19] and [20] for
more details.

Importance of “async” keyword

While it is possible to just implement await expression and treat
all functions with at least one await as coroutines, this approach
makes API design, code refactoring and long-term support harder.

Let’s pretend that Python only has await keyword:

def useful():
    ...
    await log(...)
    ...

def important():
    await useful()

If useful() function is refactored and someone removes all
await expressions from it, it would become a regular python
function, and all code that depends on it, including important()
would be broken. To mitigate this issue a decorator similar to
@asyncio.coroutine has to be introduced.

Why “async def”

For some people bare async name(): pass syntax might look more
appealing than async def name(): pass. It is certainly easier to
type. But on the other hand, it breaks the symmetry between async def,
async with and async for, where async is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.

Why not “await for” and “await with”

async is an adjective, and hence it is a better choice for a
statement qualifier keyword. await for/with would imply that
something is awaiting for a completion of a for or with
statement.

Why “async def” and not “def async”

async keyword is a statement qualifier. A good analogy to it are
“static”, “public”, “unsafe” keywords from other languages. “async
for” is an asynchronous “for” statement, “async with” is an
asynchronous “with” statement, “async def” is an asynchronous function.

Having “async” after the main statement keyword might introduce some
confusion, like “for async item in iterator” can be read as “for each
asynchronous item in iterator”.

Having async keyword before def, with and for also
makes the language grammar simpler. And “async def” better separates
coroutines from regular functions visually.

Why not a __future__ import

Transition Plan section explains how tokenizer is modified to treat
async and await as keywords only in async def blocks.
Hence async def fills the role that a module level compiler
declaration like from __future__ import async_await would otherwise
fill.

Why magic methods start with “a”

New asynchronous magic methods __aiter__, __anext__,
__aenter__, and __aexit__ all start with the same prefix “a”.
An alternative proposal is to use “async” prefix, so that __anext__
becomes __async_next__. However, to align new magic methods with
the existing ones, such as __radd__ and __iadd__ it was decided
to use a shorter version.

Why not reuse existing magic names

An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an async
keyword to their declarations:

class CM:
    async def __enter__(self): # instead of __aenter__
        ...

This approach has the following downsides:

  • it would not be possible to create an object that works in both
    with and async with statements;
  • it would break backwards compatibility, as nothing prohibits
    returning Future-like objects from __enter__ and/or
    __exit__ in Python <= 3.4;
  • one of the main points of this proposal is to make native coroutines
    as simple and foolproof as possible, hence the clear separation of
    the protocols.

Why not reuse existing “for” and “with” statements

The vision behind existing generator-based coroutines and this proposal
is to make it easy for users to see where the code might be suspended.
Making the existing “for” and “with” statements recognize asynchronous
iterators and context managers will inevitably create implicit suspend
points, making it harder to reason about the code.

Comprehensions

Syntax for asynchronous comprehensions could be provided, but this
construct is outside of the scope of this PEP.

Async lambda functions

Syntax for asynchronous lambda functions could be provided, but this
construct is outside of the scope of this PEP.

Performance

Overall Impact

This proposal introduces no observable performance impact. Here is an
output of python’s official set of benchmarks [4]:

python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

[skipped]

Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
x86_64 i386

Total CPU cores: 8

### etree_iterparse ###
Min: 0.365359 -> 0.349168: 1.05x faster
Avg: 0.396924 -> 0.379735: 1.05x faster
Significant (t=9.71)
Stddev: 0.01225 -> 0.01277: 1.0423x larger

The following not significant results are hidden, use -v to show them:
django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.

Tokenizer modifications

There is no observable slowdown of parsing python files with the
modified tokenizer: parsing of one 12Mb file
(Lib/test/test_binop.py repeated 1000 times) takes the same amount
of time.

async/await

The following micro-benchmark was used to determine performance
difference between “async” functions and generators:

import sys
import time

def binary(n):
    if n <= 0:
        return 1
    l = yield from binary(n - 1)
    r = yield from binary(n - 1)
    return l + 1 + r

async def abinary(n):
    if n <= 0:
        return 1
    l = await abinary(n - 1)
    r = await abinary(n - 1)
    return l + 1 + r

def timeit(func, depth, repeat):
    t0 = time.time()
    for _ in range(repeat):
        o = func(depth)
        try:
            while True:
                o.send(None)
        except StopIteration:
            pass
    t1 = time.time()
    print('{}({}) * {}: total {:.3f}s'.format(
        func.__name__, depth, repeat, t1-t0))
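
The benchmark's invocation is not shown; judging from the output lines below, it was presumably called along these lines (an assumption, not part of the original listing):

# Assumed invocation matching the "binary(19) * 30" output format:
timeit(binary, 19, 30)
timeit(abinary, 19, 30)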

The result is that there is no observable performance difference:

binary(19) * 30: total 53.321s
abinary(19) * 30: total 55.073s

binary(19) * 30: total 53.361s
abinary(19) * 30: total 51.360s

binary(19) * 30: total 49.438s
abinary(19) * 30: total 51.047s

Note that depth of 19 means 1,048,575 calls.

Reference Implementation

The reference implementation can be found here: [3].

List of high-level changes and new protocols

  1. New syntax for defining coroutines: async def and new await
    keyword.
  2. New __await__ method for Future-like objects, and new
    tp_as_async.am_await slot in PyTypeObject.
  3. New syntax for asynchronous context managers: async with. And
    associated protocol with __aenter__ and __aexit__ methods.
  4. New syntax for asynchronous iteration: async for. And
    associated protocol with __aiter__, __anext__ and new built-in
    exception StopAsyncIteration. New tp_as_async.am_aiter
    and tp_as_async.am_anext slots in PyTypeObject.
  5. New AST nodes: AsyncFunctionDef, AsyncFor, AsyncWith,
    Await.
  6. New functions: sys.set_coroutine_wrapper(callback),
    sys.get_coroutine_wrapper(), types.coroutine(gen),
    inspect.iscoroutinefunction(func), inspect.iscoroutine(obj),
    inspect.isawaitable(obj), inspect.getcoroutinestate(coro),
    and inspect.getcoroutinelocals(coro).
  7. New CO_COROUTINE and CO_ITERABLE_COROUTINE bit flags for code
    objects.
  8. New ABCs: collections.abc.Awaitable,
    collections.abc.Coroutine, collections.abc.AsyncIterable, and
    collections.abc.AsyncIterator.
  9. C API changes: new PyCoro_Type (exposed to Python as
    types.CoroutineType) and PyCoroObject.
    PyCoro_CheckExact(*o) to test if o is a native coroutine.

While the list of changes and new things is not short, it is important
to understand, that most users will not use these features directly.
It is intended to be used in frameworks and libraries to provide users
with convenient to use and unambiguous APIs with async def,
await, async for and async with syntax.

Working example

All concepts proposed in this PEP are implemented [3] and can be
tested.

import asyncio

async def echo_server():
    print('Serving on localhost:8000')
    await asyncio.start_server(handle_connection,
                               'localhost', 8000)

async def handle_connection(reader, writer):
    print('New connection...')

    while True:
        data = await reader.read(8192)

        if not data:
            break

        print('Sending {:.10}... back'.format(repr(data)))
        writer.write(data)

loop = asyncio.get_event_loop()
loop.run_until_complete(echo_server())
try:
    loop.run_forever()
finally:
    loop.close()

Acceptance

PEP 492 was accepted by Guido, Tuesday, May 5, 2015 [14].

Implementation

The implementation is tracked in issue 24017 [15]. It was
committed on May 11, 2015.

References

Acknowledgments

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, Łukasz Langa, Greg Ewing, Stephen J. Turnbull, Jim J. Jewett,
Brett Cannon, Nick Coghlan, Steven D’Aprano, Paul Moore, Nathaniel
Smith, Ethan Furman, Stefan Behnel, Paul Sokolovsky, Victor Petrovykh,
and many others for their feedback, ideas, edits, criticism, code
reviews, and discussions around this PEP.

Copyright

This document has been placed in the public domain.

Содержание

  1. Python Enhancement Proposals
  2. PEP 492 – Coroutines with async and await syntax
  3. Abstract
  4. API Design and Implementation Revisions
  5. Rationale and Goals
  6. Specification
  7. New Coroutine Declaration Syntax
  8. types.coroutine()
  9. Await Expression
  10. Updated operator precedence table
  11. Examples of “await” expressions
  12. Asynchronous Context Managers and “async with”
  13. New Syntax
  14. Example
  15. Asynchronous Iterators and “async for”
  16. New Syntax
  17. Example 1
  18. Example 2
  19. Why StopAsyncIteration?
  20. Coroutine objects
  21. Differences from generators
  22. Coroutine object methods
  23. Debugging Features
  24. New Standard Library Functions
  25. New Abstract Base Classes
  26. Glossary
  27. Transition Plan
  28. Backwards Compatibility
  29. asyncio
  30. asyncio migration strategy
  31. async/await in CPython code base
  32. Grammar Updates
  33. Deprecation Plans
  34. Design Considerations
  35. PEP 3152
  36. Coroutine-generators
  37. Why “async” and “await” keywords
  38. Why “__aiter__” does not return an awaitable
  39. Importance of “async” keyword
  40. Why “async def”
  41. Why not “await for” and “await with”
  42. Why “async def” and not “def async”
  43. Why not a __future__ import
  44. Why magic methods start with “a”
  45. Why not reuse existing magic names
  46. Why not reuse existing “for” and “with” statements
  47. Comprehensions
  48. Async lambda functions
  49. Performance
  50. Overall Impact
  51. Tokenizer modifications
  52. async/await
  53. Reference Implementation
  54. List of high-level changes and new protocols
  55. Working example
  56. Acceptance
  57. Implementation
  58. References
  59. Acknowledgments
  60. Copyright

Python Enhancement Proposals

PEP 492 – Coroutines with async and await syntax

Abstract

The growth of Internet and general connectivity has triggered the proportionate need for responsive and scalable code. This proposal aims to answer that need by making writing explicitly asynchronous, concurrent Python code easier and more Pythonic.

It is proposed to make coroutines a proper standalone concept in Python, and introduce new supporting syntax. The ultimate goal is to help establish a common, easily approachable, mental model of asynchronous programming in Python and make it as close to synchronous programming as possible.

This PEP assumes that the asynchronous tasks are scheduled and coordinated by an Event Loop similar to that of stdlib module asyncio.events.AbstractEventLoop . While the PEP is not tied to any specific Event Loop implementation, it is relevant only to the kind of coroutine that uses yield as a signal to the scheduler, indicating that the coroutine will be waiting until an event (such as IO) is completed.

We believe that the changes proposed here will help keep Python relevant and competitive in a quickly growing area of asynchronous programming, as many other languages have adopted, or are planning to adopt, similar features: [2], [5], [6], [7], [8], [10].

API Design and Implementation Revisions

  1. Feedback on the initial beta release of Python 3.5 resulted in a redesign of the object model supporting this PEP to more clearly separate native coroutines from generators — rather than being a new kind of generator, native coroutines are now their own completely distinct type (implemented in [17]).

This change was implemented based primarily due to problems encountered attempting to integrate support for native coroutines into the Tornado web server (reported in [18]).

In CPython 3.5.2, the __aiter__ protocol was updated.

Before 3.5.2, __aiter__ was expected to return an awaitable resolving to an asynchronous iterator. Starting with 3.5.2, __aiter__ should return asynchronous iterators directly.

If the old protocol is used in 3.5.2, Python will raise a PendingDeprecationWarning .

In CPython 3.6, the old __aiter__ protocol will still be supported with a DeprecationWarning being raised.

In CPython 3.7, the old __aiter__ protocol will no longer be supported: a RuntimeError will be raised if __aiter__ returns anything but an asynchronous iterator.

See [19] and [20] for more details.

Rationale and Goals

Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the yield from syntax introduced in PEP 380. This approach has a number of shortcomings:

  • It is easy to confuse coroutines with regular generators, since they share the same syntax; this is especially true for new developers.
  • Whether or not a function is a coroutine is determined by a presence of yield or yield from statements in its body, which can lead to unobvious errors when such statements appear in or disappear from function body during refactoring.
  • Support for asynchronous calls is limited to expressions where yield is allowed syntactically, limiting the usefulness of syntactic features, such as with and for statements.

This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new async with statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new async for statement makes it possible to perform asynchronous calls in iterators.

Specification

This proposal introduces new syntax and semantics to enhance coroutine support in Python.

This specification presumes knowledge of the implementation of coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax changes proposed here comes from the asyncio framework (PEP 3156) and the “Cofunctions” proposal (PEP 3152, now rejected in favor of this specification).

From this point in this document we use the word native coroutine to refer to functions declared using the new syntax. generator-based coroutine is used where necessary to refer to coroutines that are based on generator syntax. coroutine is used in contexts where both definitions are applicable.

New Coroutine Declaration Syntax

The following new syntax is used to declare a native coroutine:

Key properties of coroutines:

  • async def functions are always coroutines, even if they do not contain await expressions.
  • It is a SyntaxError to have yield or yield from expressions in an async function.
  • Internally, two new code object flags were introduced:
    • CO_COROUTINE is used to mark native coroutines (defined with new syntax).
    • CO_ITERABLE_COROUTINE is used to make generator-based coroutines compatible with native coroutines (set by types.coroutine() function).
  • Regular generators, when called, return a generator object; similarly, coroutines return a coroutine object.
  • StopIteration exceptions are not propagated out of coroutines, and are replaced with a RuntimeError . For regular generators such behavior requires a future import (see PEP 479).
  • When a native coroutine is garbage collected, a RuntimeWarning is raised if it was never awaited on (see also Debugging Features).
  • See also Coroutine objects section.

types.coroutine()

A new function coroutine(fn) is added to the types module. It allows interoperability between existing generator-based coroutines in asyncio and native coroutines introduced by this PEP:

The function applies CO_ITERABLE_COROUTINE flag to generator- function’s code object, making it return a coroutine object.

If fn is not a generator function, it is wrapped. If it returns a generator, it will be wrapped in an awaitable proxy object (see below the definition of awaitable objects).

Note, that the CO_COROUTINE flag is not applied by types.coroutine() to make it possible to separate native coroutines defined with new syntax, from generator-based coroutines.

Await Expression

The following new await expression is used to obtain a result of coroutine execution:

await , similarly to yield from , suspends execution of read_data coroutine until db.fetch awaitable completes and returns the result data.

It uses the yield from implementation with an extra step of validating its argument. await only accepts an awaitable, which can be one of:

  • A native coroutine object returned from a native coroutine function.
  • A generator-based coroutine object returned from a function decorated with types.coroutine() .
  • An object with an __await__ method returning an iterator.

Any yield from chain of calls ends with a yield . This is a fundamental mechanism of how Futures are implemented. Since, internally, coroutines are a special kind of generators, every await is suspended by a yield somewhere down the chain of await calls (please refer to PEP 3156 for a detailed explanation).

To enable this behavior for coroutines, a new magic method called __await__ is added. In asyncio, for instance, to enable Future objects in await statements, the only change is to add __await__ = __iter__ line to asyncio.Future class.

Objects with __await__ method are called Future-like objects in the rest of this PEP.

It is a TypeError if __await__ returns anything but an iterator.

  • Objects defined with CPython C API with a tp_as_async.am_await function, returning an iterator (similar to __await__ method).
  • It is a SyntaxError to use await outside of an async def function (like it is a SyntaxError to use yield outside of def function).

    It is a TypeError to pass anything other than an awaitable object to an await expression.

    Updated operator precedence table

    await keyword is defined as follows:

    where “primary” represents the most tightly bound operations of the language. Its syntax is:

    See Python Documentation [12] and Grammar Updates section of this proposal for details.

    The key await difference from yield and yield from operators is that await expressions do not require parentheses around them most of the times.

    Also, yield from allows any expression as its argument, including expressions like yield from a() + b() , that would be parsed as yield from (a() + b()) , which is almost always a bug. In general, the result of any arithmetic operation is not an awaitable object. To avoid this kind of mistakes, it was decided to make await precedence lower than [] , () , and . , but higher than ** operators.

    Operator Description
    yield x , yield from x Yield expression
    lambda Lambda expression
    if – else Conditional expression
    or Boolean OR
    and Boolean AND
    not x Boolean NOT
    in , not in , is , is not , , , > , >= , != , == Comparisons, including membership tests and identity tests
    | Bitwise OR
    ^ Bitwise XOR
    & Bitwise AND
    , >> Shifts
    + , — Addition and subtraction
    * , @ , / , // , % Multiplication, matrix multiplication, division, remainder
    +x , -x ,

    x

    Positive, negative, bitwise NOT
    ** Exponentiation
    await x Await expression
    x[index] , x[index:index] , x(arguments. ) , x.attribute Subscription, slicing, call, attribute reference
    (expressions. ) , [expressions. ] , , Binding or tuple display, list display, dictionary display, set display

    Examples of “await” expressions

    Valid syntax examples:

    Expression Will be parsed as
    if await fut: pass if (await fut): pass
    if await fut + 1: pass if (await fut) + 1: pass
    pair = await fut, ‘spam’ pair = (await fut), ‘spam’
    with await fut, open(): pass with (await fut), open(): pass
    await foo()[‘spam’].baz()() await ( foo()[‘spam’].baz()() )
    return await coro() return ( await coro() )
    res = await coro() ** 2 res = (await coro()) ** 2
    func(a1=await coro(), a2=0) func(a1=(await coro()), a2=0)
    await foo() + await bar() (await foo()) + (await bar())
    -await foo() -(await foo())

    Invalid syntax examples:

    Expression Should be written as
    await await coro() await (await coro())
    await -coro() await (-coro())

    Asynchronous Context Managers and “async with”

    An asynchronous context manager is a context manager that is able to suspend execution in its enter and exit methods.

    To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: __aenter__ and __aexit__ . Both must return an awaitable.

    An example of an asynchronous context manager:
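
    A minimal sketch (the log() coroutine is assumed to exist; the point is only that both methods are themselves coroutines):

        class AsyncContextManager:
            async def __aenter__(self):
                await log('entering context')     # may suspend here

            async def __aexit__(self, exc_type, exc, tb):
                await log('exiting context')      # and here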

    New Syntax

    A new statement for asynchronous context managers is proposed:
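
    Schematically, it has the same shape as a regular with statement (EXPR, VAR and BLOCK are placeholders):

        async with EXPR as VAR:
            BLOCK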

    which is semantically equivalent to:
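
    A sketch of the expansion, using the same placeholders ( sys is assumed to be imported):

        mgr = (EXPR)
        aexit = type(mgr).__aexit__
        aenter = type(mgr).__aenter__(mgr)

        VAR = await aenter
        try:
            BLOCK
        except:
            if not await aexit(mgr, *sys.exc_info()):
                raise
        else:
            await aexit(mgr, None, None, None)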

    As with regular with statements, it is possible to specify multiple context managers in a single async with statement.

    It is an error to pass a regular context manager without __aenter__ and __aexit__ methods to async with . It is a SyntaxError to use async with outside of an async def function.

    Example

    With asynchronous context managers it is easy to implement proper database transaction managers for coroutines:
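
    For example, a session object might expose a transaction() asynchronous context manager; session, transaction() and update() are assumed names used only for illustration:

        async def commit(session, data):
            async with session.transaction():
                await session.update(data)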

    Code that needs locking also looks lighter:
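
    A sketch comparing the new and the old generator-based spelling (lock is assumed to be an asyncio-style lock):

        import asyncio

        async def do_work(lock):
            async with lock:
                ...                      # critical section

        # versus the pre-PEP, generator-based style:
        @asyncio.coroutine
        def do_work_old(lock):
            with (yield from lock):
                ...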

    Asynchronous Iterators and “async for”

    An asynchronous iterable is able to call asynchronous code in its iter implementation, and an asynchronous iterator can call asynchronous code in its next method. To support asynchronous iteration:

    1. An object must implement an __aiter__ method (or, if defined with CPython C API, tp_as_async.am_aiter slot) returning an asynchronous iterator object.
    2. An asynchronous iterator object must implement an __anext__ method (or, if defined with CPython C API, tp_as_async.am_anext slot) returning an awaitable.
    3. To stop iteration __anext__ must raise a StopAsyncIteration exception.

    An example of asynchronous iterable:
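
    A minimal sketch (fetch_data() stands in for whatever asynchronous call actually produces the data):

        class AsyncIterable:
            def __aiter__(self):
                return self

            async def __anext__(self):
                data = await self.fetch_data()
                if data:
                    return data
                else:
                    raise StopAsyncIteration

            async def fetch_data(self):
                ...                      # e.g. read from a socket or database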

    New Syntax

    A new statement for iterating through asynchronous iterators is proposed:
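
    Schematically (TARGET, ITER, BLOCK and BLOCK2 are placeholders; the optional else clause mirrors the regular for statement):

        async for TARGET in ITER:
            BLOCK
        else:
            BLOCK2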

    which is semantically equivalent to:
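
    A sketch of the expansion, using the same placeholders:

        iter = (ITER)
        iter = type(iter).__aiter__(iter)
        running = True
        while running:
            try:
                TARGET = await type(iter).__anext__(iter)
            except StopAsyncIteration:
                running = False
            else:
                BLOCK
        else:
            BLOCK2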

    It is a TypeError to pass a regular iterable without __aiter__ method to async for . It is a SyntaxError to use async for outside of an async def function.

    As with the regular for statement, async for has an optional else clause.

    Example 1

    With asynchronous iteration protocol it is possible to asynchronously buffer data during iteration:
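
    In sketch form (this loop must itself appear inside an async def function):

        async for data in cursor:
            ...                          # process one row; the next batch is
                                         # prefetched transparently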

    Where cursor is an asynchronous iterator that prefetches N rows of data from a database after every N iterations.

    The following code illustrates new asynchronous iteration protocol:
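
    A sketch of such a Cursor class (the _prefetch() coroutine, which actually talks to the database, is left unimplemented):

        import collections

        class Cursor:
            def __init__(self):
                self.buffer = collections.deque()

            async def _prefetch(self):
                ...                      # fetch the next N rows from the database

            def __aiter__(self):
                return self

            async def __anext__(self):
                if not self.buffer:
                    self.buffer = await self._prefetch()
                    if not self.buffer:
                        raise StopAsyncIteration
                return self.buffer.popleft()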

    then the Cursor class can be used as follows:
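
    A sketch, assuming the Cursor class above (again, inside an async def function):

        async for row in Cursor():
            print(row)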

    which would be equivalent to the following code:
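
    That is, in sketch form, the async for loop performs roughly this work:

        i = Cursor().__aiter__()
        while True:
            try:
                row = await i.__anext__()
            except StopAsyncIteration:
                break
            else:
                print(row)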

    Example 2

    The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators.
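
    A sketch of such a wrapper:

        class AsyncIteratorWrapper:
            def __init__(self, obj):
                self._it = iter(obj)

            def __aiter__(self):
                return self

            async def __anext__(self):
                try:
                    value = next(self._it)
                except StopIteration:
                    # Translate the regular end-of-iteration signal into the
                    # asynchronous one.
                    raise StopAsyncIteration
                return value

        # Usage (inside an async def function):
        #     async for letter in AsyncIteratorWrapper("abc"):
        #         print(letter)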

    Why StopAsyncIteration?

    Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between
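
    the following two generators (a sketch; fut stands for any future-like object), both of which end the iteration with StopIteration('spam'):

        def g1():
            yield from fut
            return 'spam'

        def g2():
            yield from fut
            raise StopIteration('spam')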

    And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its StopIteration wrapped into a RuntimeError
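
    For instance, in sketch form:

        async def coro():
            raise StopIteration('spam')   # with PEP 479 semantics, this surfaces
                                          # to the caller as a RuntimeError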

    The only way to tell the outside code that the iteration has ended is to raise something other than StopIteration . Therefore, a new built-in exception class StopAsyncIteration was added.

    Moreover, with semantics from PEP 479, all StopIteration exceptions raised in coroutines are wrapped in RuntimeError .

    Coroutine objects

    Differences from generators

    This section applies only to native coroutines with CO_COROUTINE flag, i.e. defined with the new async def syntax.

    The behavior of existing generator-based coroutines in asyncio remains unchanged.

    Great effort has been made to make sure that coroutines and generators are treated as distinct concepts:

      Native coroutine objects do not implement __iter__ and __next__ methods. Therefore, they cannot be iterated over or passed to iter() , list() , tuple() and other built-ins. They also cannot be used in a for..in loop.

    An attempt to use __iter__ or __next__ on a native coroutine object will result in a TypeError .

  • Plain generators cannot yield from native coroutines: doing so will result in a TypeError .
  • generator-based coroutines (which, for asyncio code, must be decorated with @asyncio.coroutine ) can yield from native coroutine objects.
  • inspect.isgenerator() and inspect.isgeneratorfunction() return False for native coroutine objects and native coroutine functions.
    Coroutine object methods

    Coroutines are based on generators internally, thus they share the implementation. Similarly to generator objects, coroutines have throw() , send() and close() methods. StopIteration and GeneratorExit play the same role for coroutines (although PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380, and Python Documentation [11] for details.

    throw() , send() methods for coroutines are used to push values and raise errors into Future-like objects.

    Debugging Features

    A common beginner mistake is forgetting to use yield from on coroutines:
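
    A sketch of the mistake, in generator-based asyncio code:

        import asyncio

        @asyncio.coroutine
        def useful():
            ...                  # do something useful asynchronously

        @asyncio.coroutine
        def important():
            useful()             # BUG: missing 'yield from' -- the coroutine
                                 # object is created but never actually runs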

    For debugging this kind of mistake there is a special debug mode in asyncio, in which the @coroutine decorator wraps all functions with a special object whose destructor logs a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, a stack trace of where it was collected, etc. The wrapper object also provides a convenient __repr__ function with detailed information about the generator.

    The only problem is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, the @coroutine decorator decides whether or not to wrap based on an OS environment variable, PYTHONASYNCIODEBUG . This way it is possible to run asyncio programs with asyncio’s own functions instrumented. EventLoop.set_debug , a different debug facility, has no impact on the @coroutine decorator’s behavior.

    With this proposal, coroutines are a native concept, distinct from generators. In addition to a RuntimeWarning being raised on coroutines that were never awaited, it is proposed to add two new functions to the sys module: set_coroutine_wrapper and get_coroutine_wrapper . This is to enable advanced debugging facilities in asyncio and other frameworks (such as displaying where exactly a coroutine was created, and a more detailed stack trace of where it was garbage collected).

    New Standard Library Functions

    • types.coroutine(gen) . See types.coroutine() section for details.
    • inspect.iscoroutine(obj) returns True if obj is a native coroutine object.
    • inspect.iscoroutinefunction(obj) returns True if obj is a native coroutine function.
    • inspect.isawaitable(obj) returns True if obj is an awaitable.
    • inspect.getcoroutinestate(coro) returns the current state of a native coroutine object (mirrors inspect.getgeneratorstate(gen) ).
    • inspect.getcoroutinelocals(coro) returns the mapping of a native coroutine object’s local variables to their values (mirrors inspect.getgeneratorlocals(gen) ).
    • sys.set_coroutine_wrapper(wrapper) allows intercepting the creation of native coroutine objects. wrapper must be either a callable that accepts one argument (a coroutine object), or None . None resets the wrapper. If called twice, the new wrapper replaces the previous one. The function is thread-specific. See Debugging Features for more details, and the sketch after this list for an example.
    • sys.get_coroutine_wrapper() returns the current wrapper object. Returns None if no wrapper was set. The function is thread-specific. See Debugging Features for more details.
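
    A minimal sketch of how a framework might use these hooks; the wrapper here only tags each coroutine as it is created, and the names are illustrative:

        import sys

        def tag_coroutine(coro):
            # Called once for every native coroutine object created in this thread.
            print('created coroutine:', coro.__qualname__)
            return coro

        sys.set_coroutine_wrapper(tag_coroutine)    # install the wrapper
        try:
            async def ham():
                pass

            c = ham()         # prints something like: created coroutine: ham
            c.close()         # close it to avoid the "never awaited" warning
        finally:
            sys.set_coroutine_wrapper(None)         # reset to no wrapper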

    New Abstract Base Classes

    In order to allow better integration with existing frameworks (such as Tornado, see [13]) and compilers (such as Cython, see [16]), two new Abstract Base Classes (ABC) are added:

    • collections.abc.Awaitable ABC for Future-like classes, that implement __await__ method.
    • collections.abc.Coroutine ABC for coroutine objects, that implement send(value) , throw(type, exc, tb) , close() and __await__() methods.

    Note that generator-based coroutines with CO_ITERABLE_COROUTINE flag do not implement __await__ method, and therefore are not instances of collections.abc.Coroutine and collections.abc.Awaitable ABCs:
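
    For instance, in sketch form:

        import collections.abc
        import inspect
        import types

        @types.coroutine
        def gencoro():
            yield

        assert not isinstance(gencoro(), collections.abc.Coroutine)
        # ...even though it can still be awaited from a native coroutine:
        assert inspect.isawaitable(gencoro())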

    To allow easy testing if objects support asynchronous iteration, two more ABCs are added:

    • collections.abc.AsyncIterable – tests for __aiter__ method.
    • collections.abc.AsyncIterator – tests for __aiter__ and __anext__ methods.

    Glossary

    Transition Plan

    To avoid backwards compatibility issues with the async and await keywords, it was decided to modify tokenizer.c in such a way that it:

    • recognizes async def NAME tokens combination;
    • while tokenizing async def block, it replaces ‘async’ NAME token with ASYNC , and ‘await’ NAME token with AWAIT ;
    • while tokenizing def block, it yields ‘async’ and ‘await’ NAME tokens as is.

    This approach allows for seamless combination of new syntax features (all of them available only in async functions) with any existing code.

    An example of having “async def” and “async” attribute in one piece of code:
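
    A sketch (this only parses on Python 3.5/3.6, where async is a keyword only inside async def blocks, which is exactly the point being illustrated):

        import asyncio

        class Spam:
            async = 42        # 'async' is still an ordinary name here

        async def ham():
            # Inside 'async def', 'async' is a keyword, so the attribute
            # has to be reached via getattr() rather than Spam.async.
            print(getattr(Spam, 'async'))

        asyncio.get_event_loop().run_until_complete(ham())   # prints: 42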

    Backwards Compatibility

    This proposal preserves 100% backwards compatibility.

    asyncio

    asyncio module was adapted and tested to work with coroutines and new statements. Backwards compatibility is 100% preserved, i.e. all existing code will work as-is.

    The required changes are mainly:

    1. Modify @asyncio.coroutine decorator to use new types.coroutine() function.
    2. Add __await__ = __iter__ line to asyncio.Future class.
    3. Add ensure_future() as an alias for async() function. Deprecate async() function.

    asyncio migration strategy

    Because plain generators cannot yield from native coroutine objects (see Differences from generators section for more details), it is advised to make sure that all generator-based coroutines are decorated with @asyncio.coroutine before starting to use the new syntax.

    async/await in CPython code base

    There is no use of the await name in CPython.

    async is mostly used by asyncio. We are addressing this by renaming async() function to ensure_future() (see asyncio section for details).

    Another use of the async keyword is in Lib/xml/dom/xmlbuilder.py , which defines an async = False attribute for the DocumentLS class. The attribute is not documented, has no tests, and is not used anywhere else in CPython. It is replaced with a getter that raises a DeprecationWarning , advising the use of the async_ attribute instead.

    Grammar Updates

    Grammar changes are fairly minimal:
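
    The new productions look roughly like this (paraphrased from the CPython 3.5 Grammar file; exact rule names may differ slightly):

        decorated: decorators (classdef | funcdef | async_funcdef)
        async_funcdef: ASYNC funcdef

        compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                        | funcdef | classdef | decorated | async_stmt)
        async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

        power: atom_expr ['**' factor]
        atom_expr: [AWAIT] atom trailer*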

    Deprecation Plans

    async and await names will be softly deprecated in CPython 3.5 and 3.6. In 3.7 we will transform them to proper keywords. Making async and await proper keywords before 3.7 might make it harder for people to port their code to Python 3.

    Design Considerations

    PEP 3152

    PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines (called “cofunctions”). Some key points:

    1. A new keyword codef to declare a cofunction. A cofunction is always a generator, even if there are no cocall expressions inside it. Maps to async def in this proposal.
    2. A new keyword cocall to call a cofunction. Can only be used inside a cofunction. Maps to await in this proposal (with some differences, see below).
    3. It is not possible to call a cofunction without a cocall keyword.
    4. cocall grammatically requires parentheses after it:

    Differences from this proposal:

    1. There is no equivalent of __cocall__ in this PEP. In PEP 3152, __cocall__ is called and its result is passed to yield from in the cocall expression; here, the await keyword expects an awaitable object, validates its type, and executes yield from on it. The __await__ method is similar to __cocall__ , but it is only used to define Future-like objects.
    2. await is defined in almost the same way as yield from in the grammar (it is later enforced that await can only be inside async def ). It is possible to simply write await future , whereas cocall always requires parentheses.
    3. To make asyncio work with PEP 3152 it would be required to modify @asyncio.coroutine decorator to wrap all functions in an object with a __cocall__ method, or to implement __cocall__ on generators. To call cofunctions from existing generator-based coroutines it would be required to use costart(cofunc, *args, **kwargs) built-in.
    4. Since it is impossible to call a cofunction without a cocall keyword, it automatically prevents the common mistake of forgetting to use yield from on generator-based coroutines. This proposal addresses this problem with a different approach, see Debugging Features.
    5. A shortcoming of requiring a cocall keyword to call a coroutine is that if it is decided to implement coroutine-generators – coroutines with yield or async yield expressions – we wouldn’t need a cocall keyword to call them. So we’ll end up having __cocall__ and no __call__ for regular coroutines, and having __call__ and no __cocall__ for coroutine-generators.
    6. Requiring parentheses grammatically also introduces a whole lot of new problems.

    The following code:
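
    in sketch form, using asyncio-style calls:

        result = await fut
        result = await function_returning_future()
        result = await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))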

    would look like:
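
    roughly the following under PEP 3152 (this is the rejected proposal's pseudo-syntax, not valid Python):

        result = cocall fut()
        result = cocall (function_returning_future())()
        result = cocall asyncio.gather(costart(coro1, arg1, arg2),
                                       costart(coro2, arg1, arg2))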

    Coroutine-generators

    With the async for keyword it is desirable to have a concept of a coroutine-generator – a coroutine with yield and yield from expressions. To avoid any ambiguity with regular generators, we would likely require an async keyword before yield , and async yield from would raise a StopAsyncIteration exception.

    While it is possible to implement coroutine-generators, we believe that they are out of scope of this proposal. It is an advanced concept that should be carefully considered and balanced, and it would require non-trivial changes in the implementation of current generator objects. This is a matter for a separate PEP.

    Why “async” and “await” keywords

    async/await is not a new concept in programming languages:

    • C# has had it for a long time [5];
    • proposal to add async/await in ECMAScript 7 [2]; see also Traceur project [9];
    • Facebook’s Hack/HHVM [6];
    • Google’s Dart language [7];
    • Scala [8];
    • proposal to add async/await to C++ [10];
    • and many other less popular languages.

    This is a huge benefit, as some users already have experience with async/await, and because it makes working with many languages in one project easier (Python with ECMAScript 7 for instance).

    Why “__aiter__” does not return an awaitable

    PEP 492 was accepted in CPython 3.5.0 with __aiter__ defined as a method, that was expected to return an awaitable resolving to an asynchronous iterator.

    In 3.5.2 (as PEP 492 was accepted on a provisional basis) the __aiter__ protocol was updated to return asynchronous iterators directly.

    The motivation behind this change is to make it possible to implement asynchronous generators in Python. See [19] and [20] for more details.

    Importance of “async” keyword

    While it is possible to just implement the await expression and treat all functions with at least one await as coroutines, this approach makes API design, code refactoring and long-term support harder.

    Let’s pretend that Python only has await keyword:
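
    A sketch of what that hypothetical world would look like ( log() stands for any awaitable operation; note that these are plain def functions, which is exactly what this PEP disallows):

        def useful():
            ...
            await log(...)      # hypothetical: await allowed in any function
            ...

        def important():
            await useful()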

    If the useful() function is refactored and someone removes all await expressions from it, it would become a regular python function, and all code that depends on it, including important() , would be broken. To mitigate this issue a decorator similar to @asyncio.coroutine has to be introduced.

    Why “async def”

    For some people bare async name(): pass syntax might look more appealing than async def name(): pass . It is certainly easier to type. But on the other hand, it breaks the symmetry between async def , async with and async for , where async is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar.

    Why not “await for” and “await with”

    async is an adjective, and hence it is a better choice for a statement qualifier keyword. await for/with would imply that something is awaiting the completion of a for or with statement.

    Why “async def” and not “def async”

    async keyword is a statement qualifier. A good analogy to it are “static”, “public”, “unsafe” keywords from other languages. “async for” is an asynchronous “for” statement, “async with” is an asynchronous “with” statement, “async def” is an asynchronous function.

    Having “async” after the main statement keyword might introduce some confusion, like “for async item in iterator” can be read as “for each asynchronous item in iterator”.

    Having async keyword before def , with and for also makes the language grammar simpler. And “async def” better separates coroutines from regular functions visually.

    Why not a __future__ import

    Transition Plan section explains how tokenizer is modified to treat async and await as keywords only in async def blocks. Hence async def fills the role that a module level compiler declaration like from __future__ import async_await would otherwise fill.

    Why magic methods start with “a”

    New asynchronous magic methods __aiter__ , __anext__ , __aenter__ , and __aexit__ all start with the same prefix “a”. An alternative proposal is to use “async” prefix, so that __anext__ becomes __async_next__ . However, to align new magic methods with the existing ones, such as __radd__ and __iadd__ it was decided to use a shorter version.

    Why not reuse existing magic names

    An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an async keyword to their declarations:
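
    That is, something along these lines (a sketch of the rejected idea):

        class CM:
            async def __enter__(self):                   # instead of __aenter__
                ...

            async def __exit__(self, exc_type, exc, tb): # instead of __aexit__
                ...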

    This approach has the following downsides:

    • it would not be possible to create an object that works in both with and async with statements;
    • it would break backwards compatibility, as nothing prohibits returning Future-like objects from __enter__ and/or __exit__ in Python 3.4 and earlier.

    Why not reuse existing “for” and “with” statements

    The vision behind existing generator-based coroutines and this proposal is to make it easy for users to see where the code might be suspended. Making the existing for and with statements recognize asynchronous iterators and context managers would inevitably create implicit suspend points, making it harder to reason about the code.

    Comprehensions

    Syntax for asynchronous comprehensions could be provided, but this construct is outside of the scope of this PEP.

    Async lambda functions

    Syntax for asynchronous lambda functions could be provided, but this construct is outside of the scope of this PEP.

    Performance

    Overall Impact

    This proposal introduces no observable performance impact. Here is an output of python’s official set of benchmarks [4]:

    Tokenizer modifications

    There is no observable slowdown of parsing python files with the modified tokenizer: parsing of one 12Mb file ( Lib/test/test_binop.py repeated 1000 times) takes the same amount of time.

    async/await

    The following micro-benchmark was used to determine performance difference between “async” functions and generators:
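
    The benchmark compares a deeply recursive chain of generator-based calls with the equivalent chain of native coroutine calls, roughly like this (the timing harness is omitted):

        def binary(n):
            if n <= 0:
                return 1
            l = yield from binary(n - 1)
            r = yield from binary(n - 1)
            return l + 1 + r

        async def abinary(n):
            if n <= 0:
                return 1
            l = await abinary(n - 1)
            r = await abinary(n - 1)
            return l + 1 + r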

    The result is that there is no observable performance difference:

    Note that depth of 19 means 1,048,575 calls.

    Reference Implementation

    The reference implementation can be found here: [3].

    List of high-level changes and new protocols

    1. New syntax for defining coroutines: async def and new await keyword.
    2. New __await__ method for Future-like objects, and new tp_as_async.am_await slot in PyTypeObject .
    3. New syntax for asynchronous context managers: async with . And associated protocol with __aenter__ and __aexit__ methods.
    4. New syntax for asynchronous iteration: async for , and an associated protocol with __aiter__ , __anext__ and a new built-in exception StopAsyncIteration . New tp_as_async.am_aiter and tp_as_async.am_anext slots in PyTypeObject .
    5. New AST nodes: AsyncFunctionDef , AsyncFor , AsyncWith , Await .
    6. New functions: sys.set_coroutine_wrapper(callback) , sys.get_coroutine_wrapper() , types.coroutine(gen) , inspect.iscoroutinefunction(func) , inspect.iscoroutine(obj) , inspect.isawaitable(obj) , inspect.getcoroutinestate(coro) , and inspect.getcoroutinelocals(coro) .
    7. New CO_COROUTINE and CO_ITERABLE_COROUTINE bit flags for code objects.
    8. New ABCs: collections.abc.Awaitable , collections.abc.Coroutine , collections.abc.AsyncIterable , and collections.abc.AsyncIterator .
    9. C API changes: new PyCoro_Type (exposed to Python as types.CoroutineType ) and PyCoroObject . PyCoro_CheckExact(*o) to test if o is a native coroutine.

    While the list of changes and new things is not short, it is important to understand that most users will not use these features directly. They are intended to be used in frameworks and libraries, to provide users with convenient and unambiguous APIs built on the async def , await , async for and async with syntax.

    Working example

    All concepts proposed in this PEP are implemented [3] and can be tested.

    Acceptance

    PEP 492 was accepted by Guido, Tuesday, May 5, 2015 [14].

    Implementation

    The implementation is tracked in issue 24017 [15]. It was committed on May 11, 2015.

    References

    Acknowledgments

    I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew Svetlov, Łukasz Langa, Greg Ewing, Stephen J. Turnbull, Jim J. Jewett, Brett Cannon, Nick Coghlan, Steven D’Aprano, Paul Moore, Nathaniel Smith, Ethan Furman, Stefan Behnel, Paul Sokolovsky, Victor Petrovykh, and many others for their feedback, ideas, edits, criticism, code reviews, and discussions around this PEP.

    Copyright

    This document has been placed in the public domain.

    The async function declaration declares an async function where the await keyword is permitted within the function body. The async and await keywords enable asynchronous, promise-based behavior to be written in a cleaner style, avoiding the need to explicitly configure promise chains.

    Async functions may also be defined as
    expressions.

    Syntax

    async function name(param0) {
      statements
    }
    async function name(param0, param1) {
      statements
    }
    async function name(param0, param1, /* … ,*/ paramN) {
      statements
    }
    

    Parameters

    name

    The function’s name.

    param Optional

    The name of an argument to be passed to the function.

    statements Optional

    The statements comprising the body of the function. The await
    mechanism may be used.

    Return value

    A Promise which will be resolved with the value returned by the async
    function, or rejected with an exception thrown from, or uncaught within, the async
    function.

    Description

    Async functions can contain zero or more await expressions. Await expressions make promise-returning functions behave as though they’re synchronous by suspending execution until the returned promise is fulfilled or rejected. The resolved value of the promise is treated as the return value of the await expression. Use of async and await enables the use of ordinary try / catch blocks around asynchronous code.

    Note: The await keyword is only valid inside async functions within regular JavaScript code. If you use it outside of an async function’s body, you will get a SyntaxError.

    await can be used on its own with JavaScript modules.

    Note: The purpose of async/await is to simplify the syntax
    necessary to consume promise-based APIs. The behavior
    of async/await is similar to combining generators and
    promises.

    Async functions always return a promise. If the return value of an async function is
    not explicitly a promise, it will be implicitly wrapped in a promise.

    For example, consider the following code:

    async function foo() {
      return 1;
    }
    

    It is similar to:

    function foo() {
      return Promise.resolve(1);
    }
    

    Note:

    Even though the return value of an async function behaves as if it’s wrapped in a Promise.resolve, they are not equivalent.

    An async function will return a different reference, whereas Promise.resolve returns the same reference if the given value is a promise.

    It can be a problem when you want to check the equality of a promise and a return value of an async function.

    const p = new Promise((res, rej) => {
      res(1);
    });
    
    async function asyncReturn() {
      return p;
    }
    
    function basicReturn() {
      return Promise.resolve(p);
    }
    
    console.log(p === basicReturn()); // true
    console.log(p === asyncReturn()); // false
    

    The body of an async function can be thought of as being split by zero or more await
    expressions. Top-level code, up to and including the first await expression (if there is
    one), is run synchronously. In this way, an async function without an await expression
    will run synchronously. If there is an await expression inside the function body,
    however, the async function will always complete asynchronously.

    For example:

    async function foo() {
      await 1;
    }
    

    It is also equivalent to:

    function foo() {
      return Promise.resolve(1).then(() => undefined);
    }
    

    Code after each await expression can be thought of as existing in a .then
    callback. In this way a promise chain is progressively constructed with each reentrant
    step through the function. The return value forms the final link in the chain.

    In the following example, we successively await two promises. Progress moves through
    function foo in three stages.

    1. The first line of the body of function foo is executed synchronously,
      with the await expression configured with the pending promise. Progress through
      foo is then suspended and control is yielded back to the function that
      called foo.
    2. Some time later, when the first promise has either been fulfilled or rejected,
      control moves back into foo. The result of the first promise fulfillment
      (if it was not rejected) is returned from the await expression. Here 1 is
      assigned to result1. Progress continues, and the second await expression
      is evaluated. Again, progress through foo is suspended and control is
      yielded.
    3. Some time later, when the second promise has either been fulfilled or rejected,
      control re-enters foo. The result of the second promise resolution is
      returned from the second await expression. Here 2 is assigned to
      result2. Control moves to the return expression (if any). The default
      return value of undefined is returned as the resolution value of the
      current promise.
    async function foo() {
      const result1 = await new Promise((resolve) =>
        setTimeout(() => resolve("1")),
      );
      const result2 = await new Promise((resolve) =>
        setTimeout(() => resolve("2")),
      );
    }
    foo();
    

    Note how the promise chain is not built-up in one go. Instead, the promise chain is
    constructed in stages as control is successively yielded from and returned to the async
    function. As a result, we must be mindful of error handling behavior when dealing with
    concurrent asynchronous operations.

    For example, in the following code an unhandled promise rejection error will be thrown,
    even if a .catch handler has been configured further along the promise
    chain. This is because p2 will not be "wired into" the promise chain until
    control returns from p1.

    async function foo() {
      const p1 = new Promise((resolve) => setTimeout(() => resolve("1"), 1000));
      const p2 = new Promise((_, reject) => setTimeout(() => reject("2"), 500));
      const results = [await p1, await p2]; // Do not do this! Use Promise.all or Promise.allSettled instead.
    }
    foo().catch(() => {}); // Attempt to swallow all errors...
    

    async function declarations are hoisted to the top of their scope and can be called anywhere in their scope.

    Examples

    Async functions and execution order

    function resolveAfter2Seconds() {
      console.log("starting slow promise");
      return new Promise((resolve) => {
        setTimeout(() => {
          resolve("slow");
          console.log("slow promise is done");
        }, 2000);
      });
    }
    
    function resolveAfter1Second() {
      console.log("starting fast promise");
      return new Promise((resolve) => {
        setTimeout(() => {
          resolve("fast");
          console.log("fast promise is done");
        }, 1000);
      });
    }
    
    async function sequentialStart() {
      console.log("==SEQUENTIAL START==");
    
      // 1. Execution gets here almost instantly
      const slow = await resolveAfter2Seconds();
      console.log(slow); // 2. this runs 2 seconds after 1.
    
      const fast = await resolveAfter1Second();
      console.log(fast); // 3. this runs 3 seconds after 1.
    }
    
    async function concurrentStart() {
      console.log("==CONCURRENT START with await==");
      const slow = resolveAfter2Seconds(); // starts timer immediately
      const fast = resolveAfter1Second(); // starts timer immediately
    
      // 1. Execution gets here almost instantly
      console.log(await slow); // 2. this runs 2 seconds after 1.
      console.log(await fast); // 3. this runs 2 seconds after 1., immediately after 2., since fast is already resolved
    }
    
    function concurrentPromise() {
      console.log("==CONCURRENT START with Promise.all==");
      return Promise.all([resolveAfter2Seconds(), resolveAfter1Second()]).then(
        (messages) => {
          console.log(messages[0]); // slow
          console.log(messages[1]); // fast
        },
      );
    }
    
    async function parallel() {
      console.log("==PARALLEL with await Promise.all==");
    
      // Start 2 "jobs" in parallel and wait for both of them to complete
      await Promise.all([
        (async () => console.log(await resolveAfter2Seconds()))(),
        (async () => console.log(await resolveAfter1Second()))(),
      ]);
    }
    
    sequentialStart(); // after 2 seconds, logs "slow", then after 1 more second, "fast"
    
    // wait above to finish
    setTimeout(concurrentStart, 4000); // after 2 seconds, logs "slow" and then "fast"
    
    // wait again
    setTimeout(concurrentPromise, 7000); // same as concurrentStart
    
    // wait again
    setTimeout(parallel, 10000); // truly parallel: after 1 second, logs "fast", then after 1 more second, "slow"
    

    await and parallelism

    In sequentialStart, execution suspends 2 seconds for the first
    await, and then another second for the second await. The
    second timer is not created until the first has already fired, so the code finishes
    after 3 seconds.

    In concurrentStart, both timers are created and then awaited.
    The timers run concurrently, which means the code finishes in 2 rather than 3 seconds,
    i.e. the slowest timer.
    However, the await calls still run in series, which means the second
    await will wait for the first one to finish. In this case, the result of
    the fastest timer is processed after the slowest.

    If you wish to safely perform two or more jobs in parallel, you must await a call
    to Promise.all,
    or
    Promise.allSettled.

    Warning: The functions concurrentStart and concurrentPromise
    are not functionally equivalent.

    In concurrentStart, if promise fast rejects before promise
    slow is fulfilled, then an unhandled promise rejection error will be
    raised, regardless of whether the caller has configured a catch clause.

    In concurrentPromise, Promise.all wires up the promise
    chain in one go, meaning that the operation will fail-fast regardless of the order of
    rejection of the promises, and the error will always occur within the configured
    promise chain, enabling it to be caught in the normal way.

    Rewriting a Promise chain with an async function

    An API that returns a Promise will result in a promise chain, and it
    splits the function into many parts. Consider the following code:

    function getProcessedData(url) {
      return downloadData(url) // returns a promise
        .catch((e) => downloadFallbackData(url)) // returns a promise
        .then((v) => processDataInWorker(v)); // returns a promise
    }
    

    it can be rewritten with a single async function as follows:

    async function getProcessedData(url) {
      let v;
      try {
        v = await downloadData(url);
      } catch (e) {
        v = await downloadFallbackData(url);
      }
      return processDataInWorker(v);
    }
    

    Alternatively, you can chain the promise with catch():

    async function getProcessedData(url) {
      const v = await downloadData(url).catch((e) => downloadFallbackData(url));
      return processDataInWorker(v);
    }
    

    In the two rewritten versions, notice there is no await statement after the
    return keyword, although that would be valid too: The return value of an
    async function is implicitly wrapped in Promise.resolve — if
    it’s not already a promise itself (as in the examples).

    Specifications

    ECMAScript Language Specification, # sec-async-function-definitions
