Resumable functions in C++

Raman Sharma

Last year, in a CTP release, we provided a glimpse into resumable functions and await support in the Visual C++ compiler. In Visual Studio 2015 Preview we have gotten further along that journey and now provide a more general-purpose solution. This Preview release contains an experimental implementation of a proposal called "Resumable Functions" for the ISO C++ Standard. This is still a work in progress, but we believe this is the right time to open up the discussion and seek design feedback. An excellent overview of the topic is already available in this CppCon video; the slides from that presentation are also available here.

As of this Preview, the feature works only for x64 targets. In order to use this experimental feature, you will need to include some new headers (e.g. "<experimental/resumable>") in your source files, as well as specify the switch "/await" on the compiler command line.
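For example, a minimal command line for building one of the samples below might look like the following (the file name is just a placeholder, and /EHsc merely enables standard C++ exception handling):

cl /EHsc /await sample.cpp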

This feature is built upon the concept of a coroutine, which you might have encountered in other languages such as Python and Ruby. A coroutine is a generalized routine entity that supports operations like suspend and resume in addition to the traditional invoke and return operations. Put simply, it can be thought of as a function which, instead of running to completion and returning to the caller, stops in the middle of processing and yields a value to the caller. The next time the coroutine is called, it resumes where it left off and runs until it yields another value.

Here are a few code examples to get you started with the key aspects of this feature:

Asynchronous operations

The code snippet below shows what the code would look like for a function that waits on a long-running operation, such as a computation or I/O. Note the usage of the proposed '__await' keyword, meant to signify waiting for the result of an asynchronous operation.

#include <cstdio>
#include <future>
#include <experimental/resumable>

using namespace std;
using namespace std::chrono;

// this could be some long-running computation or I/O
future<int> calculate_the_answer()
{
    return async([] {
        this_thread::sleep_for(1s);
        return 42;
    });
}

// Here is a resumable function
future<void> coro()
{
    printf("Started waiting...\n");
    auto result = __await calculate_the_answer();
    printf("got %d.\n", result);
}

int main()
{
    // block until the resumable function has finished
    coro().get();
}
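When run, this prints something like the following, with roughly a one-second pause between the two lines while the awaited operation completes:

Started waiting...
got 42.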

Generator pattern

The code snippet below demonstrates the usage of the proposed '__yield_value' keyword in the generator pattern, where the generator coroutine "yields" values back to the calling function and can also be cancelled on demand.

#include <iostream>
#include <experimental/generator>

using namespace std::experimental;
using namespace std;

// an infinite, lazily evaluated sequence of Fibonacci numbers
generator<int> fib()
{
    int a = 0;
    int b = 1;
    for (;;) {
        __yield_value a;
        auto next = a + b;
        a = b;
        b = next;
    }
}

int main()
{
    for (auto v : fib()) {
        if (v > 50)
            break;
        cout << v << endl;
    }
}
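This prints the Fibonacci numbers up to 50, one per line:

0
1
1
2
3
5
8
13
21
34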

Reactive Streams

The code below demonstrates the usage of the proposed 'for __await' construct in a scenario where a coroutine (Ticks) produces an asynchronous stream of values and a function (Sum) consumes those values. The coroutine Timestamp demonstrates the scenario where a coroutine consumes an incoming stream, processes it, and outputs the result to whoever is awaiting it.

// As a consumer:
future<int> Sum(async_read_stream<int>& input)
{
    int result = 0;
    for __await(v : input)
    {
        result += v;
    }
    return result;
}

// As a producer:
async_generator<int> Ticks()
{
    for (int tick = 0;; ++tick)
    {
        __yield_value tick;
        __await sleep_for(1ms);
    }
}

// As a transformer (adds a timestamp to every observed value):
template <class T>
async_generator<pair<T, system_clock::time_point>>
Timestamp(async_read_stream<T> S)
{
    for __await(v : S)
    {
        __yield_value { v, system_clock::now() };
    }
}
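To tie these pieces together, a caller simply awaits the consuming coroutine like any other awaitable. The snippet below is only a sketch built on the definitions above: it assumes an async_read_stream<int> named input is obtained from somewhere (for instance, adapted from Ticks()), since that adapter is outside the scope of this post.

// hypothetical caller: awaits the asynchronous Sum and prints the result
future<void> PrintSum(async_read_stream<int>& input)
{
    auto total = __await Sum(input);
    printf("sum: %d\n", total);
}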

These are just a few examples of what this feature enables. We will continue our work in this area beyond this Preview release and hope to add broader coverage, a better user experience, and built-in support for more high-level scenarios in upcoming releases. In the meantime, we hope you like what you have seen so far, play with the feature, and find novel uses for the underlying concepts and functionality. We look forward to hearing your feedback.
