Adaptive Cards, Desktop Bridge and Timeline in Windows 10


Adaptive Cards is a relatively new concept introduced by Microsoft: it’s a way to describe visual content using a standard JSON format. A card is a collection of information that you want to display to your users and it can include images, text, complex layouts and even actions, like in the following sample:

[Image: a sample Adaptive Card]

Adaptive Cards, besides powering many multi-device experiences (apps, bots, websites, etc.), are also leveraged by an upcoming Windows 10 feature called Timeline. In this post we’re going to combine these two concepts (Adaptive Cards and Timeline) inside a WPF application packaged with the Desktop Bridge. This way, we’ll see how a traditional desktop application can also participate in the Timeline experience and enhance the user’s productivity.

What is Timeline?

Do you remember Project Rome? It’s a set of infrastructure services, whose support has been built into UWP since the Windows 10 Creators Update, that allow you to implement cross-device experiences. Thanks to these services and the corresponding SDKs (which are available for all the major platforms, like Xamarin, iOS and Android), you can define tasks that can be started on one device and resumed on another. For example, you can start editing a document on your phone and then, once you are in front of your PC, resume the job on a bigger screen. Project Rome is powered by Microsoft Graph, a very powerful set of REST APIs that you can use to connect your application to the user’s data that lives in Office 365: mails, people, calendars, files, etc. Last year, with the goal of empowering cross-device activities, the team added two new endpoints to the Graph APIs: devices and activities (you can read more at https://developer.microsoft.com/en-us/graph/docs/api-reference/beta/resources/project_rome_overview). Thanks to these new entry points you can leverage the Microsoft Graph APIs to query all the devices that belong to the user and to add new activities to his history. This way you can track an activity (let’s say, writing a Word document) and the device where it happened (let’s say, your Surface).
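
Even without the UWP APIs we’ll use later in this post, the activities endpoint can be invoked directly over HTTP. The following is only a rough sketch, not production code: the payload shape follows the beta documentation linked above, while the appActivityId, the display strings and the accessToken (which must come from your own OAuth flow against the Graph) are all illustrative assumptions:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class GraphActivitySketch
{
    // Hypothetical helper: registers a user activity through the Microsoft Graph
    // beta "activities" endpoint. The payload follows the project_rome_overview
    // documentation; all the string values here are made-up sample data.
    public static async Task PublishActivityAsync(HttpClient client, string accessToken)
    {
        string payload = @"{
            ""appActivityId"": ""NewBlogPost"",
            ""activitySourceHost"": ""https://blogs.msdn.microsoft.com"",
            ""appDisplayName"": ""AppConsult Reader"",
            ""activationUrl"": ""adaptivecards://openLastPost"",
            ""visualElements"": { ""displayText"": ""Windows AppConsult blog"" }
        }";

        // Activities are upserted with a PUT against their appActivityId
        var request = new HttpRequestMessage(HttpMethod.Put,
            "https://graph.microsoft.com/beta/me/activities/NewBlogPost");
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        request.Content = new StringContent(payload, Encoding.UTF8, "application/json");

        HttpResponseMessage response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
```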

To further expand the cross-device experiences in Windows 10, the team is adding a new feature in the next update (codename RS4) called Timeline. Whenever you perform an action with an application that supports Project Rome, like browsing a website in Edge or opening an Office document, Windows tracks it and displays it in the timeline of activities performed by the user. The timeline is tied to the user’s account, not to the device he’s using, so for example on your PC you will be able to see a website you have opened on your phone or a document you have worked on on another PC. Timeline replaces the old Task View in Windows 10 and, in fact, if you install the latest Insider Preview, you will immediately notice a new icon near the Start Menu and Cortana:

[Image: the new Timeline icon in the Windows taskbar]

When you click on it you will still have access to the traditional task view feature (which means that you will continue to see all the opened apps and your virtual desktops), but below you will also see all the past activities structured like a real timeline. All the activities will be sorted from the newest to the oldest and they will be split into different groups, based on the date and time when they happened. You will also have the option to search through them.

[Image: the Timeline view showing past activities grouped by date]

How is the Timeline feature connected to Adaptive Cards? As we’re going to see, from a visual point of view, an entry in Timeline is represented exactly by an Adaptive Card.

Adaptive cards

Adaptive Cards are made of two components:

  1. The JSON payload, which is platform-agnostic and simply describes the kind of content you want to include. It doesn’t contain any reference to the layout or to the platform where the card is going to be rendered. Here is, as an example, the JSON payload of the card you’ve seen at the beginning of the post:
    {
      "type": "AdaptiveCard",
      "version": "1.0",
      "body": [
        {
          "type": "TextBlock",
          "size": "medium",
          "text": "Choosing the right icon for the Store in a UWP or Desktop Bridge app",
          "wrap": true
        },
        {
          "type": "ColumnSet",
          "columns": [
            {
              "type": "Column",
              "width": "auto",
              "items": [
                {
                  "type": "Image",
                  "size": "small",
                  "style": "person",
                  "url": "https://pbs.twimg.com/profile_images/587911661526327296/ZpWZRPcp_400x400.jpg"
                }
              ]
            },
            {
              "type": "Column",
              "width": "stretch",
              "items": [
                {
                  "type": "TextBlock",
                  "weight": "bolder",
                  "text": "Matteo Pagani",
                  "wrap": true
                },
                {
                  "type": "TextBlock",
                  "isSubtle": true,
                  "text": "09/03/2018",
                  "wrap": true,
                  "spacing": "none",
                  "separation": "none"
                }
              ]
            }
          ]
        },
        {
          "type": "TextBlock",
          "text": "The manifest editor included in Visual Studio 2017 for UWP or Desktop Bridge apps is a great starting point to handle the various assets of your appli...",
          "wrap": true
        }
      ],
      "actions": [
        {
          "type": "Action.OpenUrl",
          "url": "https://blogs.msdn.microsoft.com/appconsult/2018/03/09/choosing-the-right-icon-for-the-store-in-a-uwp-or-desktop-bridge-app/",
          "title": "View"
        }
      ]
    }
    
  2. The renderer, which is a component that takes the JSON and converts it into a visual element. The renderer, on the other hand, is tied to a specific platform, since it leverages the native UI components to create the visual layout. As of today, there are multiple renderers ready to be used for the most important platforms: UWP, WPF, Web (both client-side in JavaScript and server-side in .NET), Android and iOS (both native or with Xamarin). They can all be downloaded from the official GitHub page, since the SDK is open source. If you need to support a platform for which there isn’t an SDK yet, you also have the option to implement your own renderer.

Actually, in order to properly support the Timeline feature, we don’t really need a renderer, since it’s Windows that automatically takes care of the rendering. All we need is a proper JSON payload and the dedicated UWP APIs to create new user activities. However, for the purpose of our demo, we’re also going to render the adaptive card inside a WPF application, so that we can offer a seamless user experience between the application and the Timeline.

Create your own adaptive card

In order to create an adaptive card you don’t need anything special, since it’s just a JSON payload. Here are two great references to get you up and running:

  1. Adaptive Cards Explorer, which contains all the details on how the JSON file for an Adaptive Card is structured: http://adaptivecards.io/explorer/
  2. Adaptive Cards Visualizer, which offers a real-time preview of the JSON file you paste in the main window: http://adaptivecards.io/visualizer/

In a real scenario, the JSON payload would probably be delivered by a remote source. For example, for my own experiments, I’ve set up an Azure Function which parses the RSS feed of this blog and returns the latest post as an adaptive card. You can see the output of the function by simply opening the following URL in your favorite browser: https://adaptivecard.azurewebsites.net/api/AppConsultAdaptiveCards?code=AzSEpdNE/P0c9OFIBjro2vSKwGIlLdBWdc53/jmR7Y9PX2l1Ks0/nQ==

This isn’t the right post to discuss the details of Azure Functions. However, for the sake of the article, it’s important just to know that in our scenario it acts like a normal REST API. Whenever a client performs a GET request against the URL of the service, the function downloads the latest RSS feed, retrieves the most recent news, creates an adaptive card from it and returns the JSON that describes it to the caller.

In my scenario, since my Azure Function is based on .NET Core 2.0, I’m able to leverage the official SDK instead of manually creating the JSON payload. The official SDK is available through a NuGet package called AdaptiveCards (https://www.nuget.org/packages/AdaptiveCards). Thanks to this SDK it’s easier to avoid mistakes during the composition of the JSON payload, since you work with objects and classes which, at the end of the process, are converted into JSON. Here is, for example, how the previous JSON is created using the Adaptive Cards SDK:

public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    FeedParser feedParser = new FeedParser();
    List<Item> items = feedParser.Parse("https://blogs.msdn.microsoft.com/appconsult/feed/", FeedType.RSS);

    var lastNews = items.OrderByDescending(x => x.PublishDate).FirstOrDefault();

    AdaptiveCard card = new AdaptiveCard();

    AdaptiveTextBlock title = new AdaptiveTextBlock
    {
        Text = lastNews.Title,
        Size = AdaptiveTextSize.Medium,
        Wrap = true
    };

    AdaptiveColumnSet columnSet = new AdaptiveColumnSet();

    AdaptiveColumn photoColumn = new AdaptiveColumn
    {
        Width = "auto"
    };
    AdaptiveImage image = new AdaptiveImage
    {
        Url = new Uri("https://pbs.twimg.com/profile_images/587911661526327296/ZpWZRPcp_400x400.jpg"),
        Size = AdaptiveImageSize.Small,
        Style = AdaptiveImageStyle.Person
    };
    photoColumn.Items.Add(image);

    AdaptiveTextBlock name = new AdaptiveTextBlock
    {
        Text = "Matteo Pagani",
        Weight = AdaptiveTextWeight.Bolder,
        Wrap = true
    };

    AdaptiveTextBlock date = new AdaptiveTextBlock
    {
        Text = lastNews.PublishDate.ToShortDateString(),
        IsSubtle = true,
        Spacing = AdaptiveSpacing.None,
        Wrap = true
    };

    AdaptiveColumn authorColumn = new AdaptiveColumn
    {
        Width = "stretch"
    };
    authorColumn.Items.Add(name);
    authorColumn.Items.Add(date);

    columnSet.Columns.Add(photoColumn);
    columnSet.Columns.Add(authorColumn);

    AdaptiveTextBlock body = new AdaptiveTextBlock
    {
        Text = $"{lastNews.Content.Substring(0, 150)}...",
        Wrap = true
    };

    AdaptiveOpenUrlAction action = new AdaptiveOpenUrlAction
    {
        Url = new Uri(lastNews.Link),
        Title = "View"
    };

    card.Body.Add(title);
    card.Body.Add(columnSet);
    card.Body.Add(body);
    card.Actions.Add(action);

    string json = card.ToJson();

    return new OkObjectResult(json);
}

For the sake of this post you can just ignore the first lines of code: they are simply the definition of an Azure Function, whose Run() method is invoked whenever a client hits the URL. The function downloads the latest version of the RSS feed of this blog, takes the most recent news and builds an adaptive card out of it. As you can see, every element of the JSON you’ve seen before has been converted into an object: the TextBlock type is represented by the AdaptiveTextBlock class; the ColumnSet type is represented by the AdaptiveColumnSet class, which includes two AdaptiveColumn objects; the Image type is represented by the AdaptiveImage class; etc. All these objects are added to the Body collection of an AdaptiveCard object, which is then serialized into JSON by calling the ToJson() method.

For the purpose of the Timeline implementation you could also simply copy and paste your own JSON into a file included inside your application and load it at runtime. However, for my sample, I’ve decided to leverage a web service to show you how the same Adaptive Card can be delivered to multiple experiences and platforms. You will see an example at the end of the post.
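
If you go for the local file route, the loading code is minimal. Here is a sketch, where Assets\card.json is a hypothetical file added to the project with Copy to Output Directory enabled:

```csharp
using System;
using System.IO;
using AdaptiveCards;

// Read the JSON payload that ships with the application instead of
// downloading it from a web service. "Assets/card.json" is a hypothetical
// file included in the project output.
string json = File.ReadAllText(Path.Combine(
    AppDomain.CurrentDomain.BaseDirectory, "Assets", "card.json"));

// Parse it with the same API used later in the post for the downloaded JSON
AdaptiveCardParseResult parseResult = AdaptiveCard.FromJson(json);
AdaptiveCard card = parseResult.Card;
```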

Rendering the Adaptive Card inside your application

Since in this post we’re going to see how to leverage Timeline in a Desktop Bridge application, we’re going to start from a WPF project. Its purpose will be, initially, just to render the latest post from the AppConsult blog using the Adaptive Card returned by our Azure Function. At a later stage, we’ll add also the code needed to add the same card to our Timeline.

After we have created our WPF project in Visual Studio, we need to add, using NuGet (right click on the project and choose Manage NuGet packages), a package called AdaptiveCards.Rendering.Wpf (https://www.nuget.org/packages/AdaptiveCards.Rendering.Wpf). This package gives us access not only to the same set of APIs we’ve seen in the Azure Function project to create and manipulate adaptive cards, but also to a specific renderer for WPF. Thanks to this, we’ll be able to pass the JSON as input and get back a ready-to-use XAML control.

Let’s start with the UI of the application, which is really simple: just a panel containing a button to trigger the operation.

<Window x:Class="AdaptiveCards.WPF.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        xmlns:local="clr-namespace:AdaptiveCards.WPF"
        mc:Ignorable="d"
        Loaded="OnWindowLoaded"
        Title="MainWindow" Height="450" Width="800">
    <StackPanel x:Name="MainPanel">
        <Button Content="Render adaptive card" Click="OnRenderAdaptiveCard" />
    </StackPanel>
</Window>

Here is what happens when you click on the button:

private async void OnRenderAdaptiveCard(object sender, RoutedEventArgs e)
{
    AdaptiveCardRenderer renderer = new AdaptiveCardRenderer();

    HttpClient client = new HttpClient();
    string json = await client.GetStringAsync("https://adaptivecard.azurewebsites.net/api/AppConsultAdaptiveCards?code=AzSEpdNE/P0c9OFIBjro2vSKwGIlLdBWdc53/jmR7Y9PX2l1Ks0/nQ==");
    AdaptiveCardParseResult card = AdaptiveCard.FromJson(json);

    var renderResult = renderer.RenderCard(card.Card);

    if (renderResult != null)
    {
        MainPanel.Children.Add(renderResult.FrameworkElement);
    }
}

First, we set up a new AdaptiveCardRenderer object, which is included in the AdaptiveCards.Rendering.Wpf namespace. Then, using the HttpClient class, we perform an HTTP GET against my Azure Function with the GetStringAsync() method, getting back a JSON string with our adaptive card. Once we have it, we rebuild an AdaptiveCard object by using the AdaptiveCard.FromJson() method, passing as parameter the JSON we have just downloaded. As a last step, we call the RenderCard() method of the renderer, passing as parameter the card we have just obtained (which is stored inside the Card property of the AdaptiveCardParseResult object). Since we’re working with the WPF renderer, the result of the rendering directly exposes a FrameworkElement property, which contains our Adaptive Card rendered using XAML. Now we can just add this control to the StackPanel in the main page to see it displayed in the main window:

[Image: the Adaptive Card rendered inside the WPF window]

Nice, isn’t it?

Adding support to Timeline

All the code we have written so far works also in a traditional desktop application. What we have built is a standard WPF application, with no UWP or Desktop Bridge involvement. The usage of Adaptive Cards, in fact, isn’t specific to the UWP world: they can be leveraged on any platform, since they’re just a consistent way to describe visual content with a JSON payload. However, when we add Timeline to the story, we start to step into the UWP world. Since Timeline is based on the Microsoft Graph APIs, you would have the chance to leverage this feature also in a regular desktop application or in a website, but thanks to the UWP APIs you can provide a much better user experience. The Microsoft Graph APIs, in fact, require the user to log in first: the cross-device experience is powered by the cloud and the Office 365 infrastructure, so the user must be logged in with his Microsoft Account in order to leverage it. Thanks to the UWP APIs, instead, you can skip this step: they will automatically leverage the Microsoft Account already linked to the current Windows user.

Please note: in order to test the following code, you need to be on RS4, which means, at the time of writing, joining the Windows Insider Program. However, the APIs we’re going to use are also included in RS3, so you don’t need to install and target the RS4 SDK.

Let’s start to change our application so that, whenever we choose to read a blog post by clicking on the View button in the card, it’s tracked as a user activity and displayed inside the Timeline view. The first step is to add a Windows Application Packaging Project to our solution. You can find it by right clicking on your solution, choosing New project and moving to the Windows Universal section:

[Image: the Windows Application Packaging Project template in the New Project dialog]

Thanks to this project, you’ll easily be able to package your WPF application using the Desktop Bridge. This is a required step because the Timeline APIs are part of the UWP ecosystem and they require the application to have an identity; a traditional WPF application would simply get exceptions when calling them. I won’t explain here the details of how to set up the Windows Application Packaging Project, since I’ve already written a post on this.

Once you have added the project to your solution, we can start to add the code needed to turn our adaptive card into a Timeline entry. To be fair, the code you’re going to see isn’t exclusive to the Desktop Bridge: it’s exactly the same code you would leverage in a UWP app. You can take a look at the following post by Shen Chauhan for a great overview on how to implement Timeline in a UWP application. However, in the case of a Desktop Bridge application, we need to take two additional steps:

  1. First we need to add a reference to the Universal Windows Platform APIs, which aren’t included by default. You can find an overview of the steps to take in the following blog post.
  2. This is optional, but suggested. Using NuGet, add to your WPF project the package called DesktopBridge.Helpers (https://www.nuget.org/packages/DesktopBridge.Helpers/), which is described in the following blog post. Thanks to this library, we’ll be able to invoke the Timeline APIs only when the application is running packaged with the Desktop Bridge.
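
For context, the check performed by the library is conceptually very simple: it asks Windows whether the current process has a package identity. The sketch below is a rough approximation of what such a check looks like, based on the kernel32 GetCurrentPackageFullName API; the actual code inside DesktopBridge.Helpers may differ:

```csharp
using System.Runtime.InteropServices;
using System.Text;

static class PackageIdentitySketch
{
    // Win32 error code returned when the process has no package identity
    private const int APPMODEL_ERROR_NO_PACKAGE = 15700;

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
    private static extern int GetCurrentPackageFullName(ref int packageFullNameLength,
        StringBuilder packageFullName);

    // Returns true when the process runs with a package identity,
    // i.e. it has been deployed through the Desktop Bridge.
    public static bool IsRunningAsUwp()
    {
        int length = 0;
        int result = GetCurrentPackageFullName(ref length, null);
        return result != APPMODEL_ERROR_NO_PACKAGE;
    }
}
```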

Now we can define a set of fields, at class level, in the MainWindow.xaml.cs file, which we’re going to reuse in our code:

private UserActivityChannel _userActivityChannel;
private UserActivity _userActivity;
private UserActivitySession _userActivitySession;
// Instantiated inline so it's ready before the first use in OnWindowLoaded
private Helpers _desktopBridgeHelpers = new Helpers();
private string json;

The first three are part of the UWP framework and belong to the Windows.ApplicationModel.UserActivities namespace. They’re the ones we will use to track an operation as a user activity, which is one of the two new Microsoft Graph endpoints we introduced at the beginning of the post. The Helpers type comes from the DesktopBridge.Helpers NuGet package we have previously installed. Finally, we store the JSON downloaded from the Azure Function at class level, since we’re going to need it in multiple event handlers.

First of all, we need to subscribe to the Loaded event of the window (handled by the OnWindowLoaded() method), in order to perform some initialization operations when the application starts:

private async void OnWindowLoaded(object sender, RoutedEventArgs e)
{
    if (_desktopBridgeHelpers.IsRunningAsUwp())
    {
        _userActivityChannel = UserActivityChannel.GetDefault();
        _userActivity = await _userActivityChannel.GetOrCreateUserActivityAsync("NewBlogPost");
    }
}

Thanks to this code we retrieve the default UserActivityChannel and use it to get or create a UserActivity, which represents a single entity we want to track. Think of it like an Office file on OneDrive or a mail in Outlook; in this case, however, it will be a custom content, specifically the information about a blog post we have decided to open inside our WPF application. The initialization is performed only if the application is running packaged with the Desktop Bridge (that is, the IsRunningAsUwp() method returns true). Now we need to change a bit the code we invoke when we click the button to render our adaptive card:

private async void OnRenderAdaptiveCard(object sender, RoutedEventArgs e)
{
    AdaptiveCardRenderer renderer = new AdaptiveCardRenderer();

    HttpClient client = new HttpClient();
    json = await client.GetStringAsync("https://adaptivecard.azurewebsites.net/api/AppConsultAdaptiveCards?code=AzSEpdNE/P0c9OFIBjro2vSKwGIlLdBWdc53/jmR7Y9PX2l1Ks0/nQ==");
    AdaptiveCardParseResult card = AdaptiveCard.FromJson(json);

    var renderResult = renderer.RenderCard(card.Card);

    if (renderResult != null)
    {
        // Subscribe only once we know the rendering succeeded
        renderResult.OnAction += RenderResult_OnAction;
        MainPanel.Children.Add(renderResult.FrameworkElement);
    }
}

Compared to the previous implementation, we have subscribed to an event called OnAction, exposed by the RenderedAdaptiveCard object. If you look back at the JSON of the adaptive card, you will notice that you can define not just visual items, but also custom actions:

"actions": [
   {
     "type": "Action.OpenUrl",
     "url": "https://blogs.msdn.microsoft.com/appconsult/2018/03/09/choosing-the-right-icon-for-the-store-in-a-uwp-or-desktop-bridge-app/",
     "title": "View"
   }
 ]

You have multiple action types to choose from. In this case, since we want to open the browser and display the full blog post, we use the Action.OpenUrl type. Additionally, we specify the url property (the URL to open) and the title property (the label of the button).
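
For reference, Action.OpenUrl is only one of the action types defined by the schema: version 1.0 also includes, for example, Action.Submit (which returns the data gathered from the card to the host application) and Action.ShowCard (which expands an inline secondary card). A Submit action, for instance, looks more or less like this (the data values here are purely illustrative):

```json
"actions": [
   {
     "type": "Action.Submit",
     "title": "Send",
     "data": { "source": "card" }
   }
 ]
```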

However, the implementation of these actions doesn’t come for free. If you remember, one of the strengths of Adaptive Cards is that they’re agnostic of the platform where they’re being rendered, since it’s just JSON. As such, it’s up to you to handle the action in the most appropriate way for the platform where the card has been rendered. In the case of a WPF application, the way to handle this is to subscribe to the OnAction event and react accordingly. Here is how we can handle the event:

private void RenderResult_OnAction(RenderedAdaptiveCard sender, AdaptiveActionEventArgs e)
{
    if (e.Action.Type == "Action.OpenUrl")
    {
        var action = e.Action as AdaptiveOpenUrlAction;
        Process.Start(action.Url.ToString());
    }
}

We detect whether this is the kind of action we expect (we could have, in fact, multiple buttons with multiple actions) and, if so, we cast the argument of the event handler to the AdaptiveOpenUrlAction type. Thanks to this object we can retrieve the URL using the Url property and pass it to the Process.Start() method, which will simply open a new instance of your default browser on the specified website. This is the place where we want to add the Timeline support: whenever the user chooses to read the full blog post, we will track it as a user activity. Here is the new code:

private async void RenderResult_OnAction(RenderedAdaptiveCard sender, AdaptiveActionEventArgs e)
{
    if (e.Action.Type == "Action.OpenUrl")
    {
        if (_desktopBridgeHelpers.IsRunningAsUwp())
        {
            _userActivity.ActivationUri = new Uri("adaptivecards://openLastPost");
            _userActivity.VisualElements.DisplayText = "Windows AppConsult blog";
            _userActivity.VisualElements.Content = AdaptiveCardBuilder.CreateAdaptiveCardFromJson(json);

            await _userActivity.SaveAsync();
            _userActivitySession?.Dispose();
            _userActivitySession = _userActivity.CreateSession();
        }

        var action = e.Action as AdaptiveOpenUrlAction;
        Process.Start(action.Url.ToString());
    }
}

As we did when the application was loaded, we perform these operations only if we are running packaged with the Desktop Bridge; otherwise, the UWP APIs will throw an exception. If we are in the right context, we initialize the UserActivity object by setting:

  • An ActivationUri: this is a protocol that is invoked when the user selects the Timeline entry. Thanks to this URI you can understand the activation context and act accordingly. For example, you could open the application and automatically redirect the user to the full blog post.
  • The visual look & feel through the VisualElements property. The two key elements are:
    • DisplayText, which is the title displayed on top of the Adaptive Card in Timeline
    • Content, which is the Adaptive Card to render. You can get the right version by using the AdaptiveCardBuilder class (which is the specific UWP renderer, in fact it’s part of the Windows.UI.Shell namespace) and the CreateAdaptiveCardFromJson() method, which takes as input the JSON we have previously downloaded using our Azure Function.


Once we have defined the UserActivity, we save it to the Microsoft Graph by calling the SaveAsync() method. Then we dispose the current UserActivitySession and initialize a new one by invoking the CreateSession() method on the UserActivity object. This way, we are ready to register a new user activity in case the user clicks on another blog post. That’s all! If we did everything properly, you can now open the Timeline on your Windows machine and you should see this:

[Image: the custom Adaptive Card displayed as an entry in Timeline]

Interacting with Timeline

As a last step, if we want to deliver a great user experience, we need to handle the Timeline interaction. If we just opened our application when the user clicks on the Adaptive Card in Timeline, it would be a very poor experience: the user doesn’t expect just to return to our app, but to the exact task he was doing and for which we have created a user activity. In our case, we’re going to directly open in the browser the blog post the user has chosen to read. As we have seen in the code we have implemented, we can handle the Timeline interaction with protocols. When we built the UserActivity object, in fact, we specified the following activation URL: adaptivecards://openLastPost. As such, as a first step, we need to register this protocol for our WPF application. Since the app is packaged with the Desktop Bridge and has access to a full UWP manifest, it’s a super easy operation. Just double click on the Package.appxmanifest file in the Windows Application Packaging Project and, in the Declarations tab, add a Protocol item and specify adaptivecards as name:

[Image: the Protocol declaration in the manifest editor]
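
If you prefer editing the manifest XML by hand, the visual editor generates an entry inside the Application element which should look more or less like the following (sketched here from the standard appxmanifest schema):

```xml
<Extensions>
  <uap:Extension Category="windows.protocol">
    <uap:Protocol Name="adaptivecards" />
  </uap:Extension>
</Extensions>
```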

Now let’s add a new window to our WPF application, which will be opened in case of activation through the protocol we have just configured. When this window is loaded, it will simply take care of downloading the most recent RSS feed, getting the URL of the last blog post and opening it in the browser. Then the application will quit. Let’s start by creating the window: right click on the WPF project, choose Add new item and select Window (WPF) from the available items.

In this new window, too, register for the Loaded event and, in the code behind, implement the following event handler:

private async void OnWindowLoaded(object sender, RoutedEventArgs e)
{
    HttpClient client = new HttpClient();
    string json = await client.GetStringAsync("https://adaptivecard.azurewebsites.net/api/AppConsultAdaptiveCards?code=AzSEpdNE/P0c9OFIBjro2vSKwGIlLdBWdc53/jmR7Y9PX2l1Ks0/nQ==");
    AdaptiveCard card = AdaptiveCard.FromJson(json).Card;

    var action = card.Actions.FirstOrDefault();
    if (action != null)
    {
        if (action.Type == "Action.OpenUrl")
        {
            var urlAction = action as AdaptiveOpenUrlAction;
            Process.Start(urlAction.Url.ToString());
            Application.Current.Shutdown();
        }
    }
}

We’re leveraging again the Azure Function I’ve built to get the JSON of the Adaptive Card and, thanks to the built-in action, to retrieve the URL of the most recent blog post from it. Then we open the browser (using Process.Start()) and, in the end, we close the application (by calling Application.Current.Shutdown()).

Now we need to change the initialization code of our WPF app. By default, in fact, a WPF application doesn’t handle startup parameters, but directly opens the page specified in the StartupUri attribute of the App class:

<Application x:Class="AdaptiveCards.WPF.App"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:local="clr-namespace:AdaptiveCards.WPF"
             StartupUri="MainWindow.xaml">
    <Application.Resources>
         
    </Application.Resources>
</Application>

We need to change this behavior in order to show the MainWindow screen if the app is opened in the standard way, or the OpenLastPost one if the app is opened from the Timeline. To do that we need to remove the StartupUri attribute from the Application tag and subscribe, instead, to the Startup event, like in the following sample:

<Application x:Class="AdaptiveCards.WPF.App"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:local="clr-namespace:AdaptiveCards.WPF"
             Startup="OnStartup">
    <Application.Resources>
         
    </Application.Resources>
</Application>

This event is invoked during the application’s startup and gives us access to the activation parameters. If we find one containing adaptivecards, we’re going to show the OpenLastPost screen; otherwise we’ll just show the standard MainWindow one.

public partial class App : Application
{
    private void OnStartup(object sender, StartupEventArgs e)
    {
        if (e.Args.Length > 0 && e.Args[0].Contains("adaptivecards"))
        {
            // Activated through the adaptivecards:// protocol from Timeline
            OpenLastPost openWindow = new OpenLastPost();
            openWindow.Show();
        }
        else
        {
            // Standard launch: show the main window
            MainWindow mainWindow = new MainWindow();
            mainWindow.Show();
        }
    }
}

Now deploy the new version of the Desktop Bridge application and, this time, instead of launching the app, open the Task View in Windows 10 and click on the Timeline entry we’ve previously created. If we did everything well, the application will start and immediately launch the browser on the latest post published on the AppConsult blog. If you don’t see the expected result, Visual Studio can help you debug this scenario. Just right click on the Windows Application Packaging Project and choose Properties, then move to the Debug tab and check the option Do not launch, but debug my code when it starts.

[Image: the Do not launch, but debug my code when it starts option in the project properties]

This way the debugger will be attached, but the application won’t actually be launched. You can then place a breakpoint in the OnStartup() event handler in the App class and click on the card in Timeline: the breakpoint will be hit and you’ll be able to better understand what’s happening.

One Adaptive Card to rule them all

The power of adaptive cards is that a single JSON payload can be used virtually on any platform and experience out there. Below you can see the same Adaptive Card being rendered:

  • In a UWP app
  • In a website
  • In a chat bot, built on top of the Microsoft Bot Framework

[Image: the Adaptive Card rendered in a UWP app]

[Image: the Adaptive Card rendered in a website]

[Image: the Adaptive Card rendered in a chat bot]

The implementation for each platform followed the same flow we used for the WPF app: first, I downloaded the JSON payload from my Azure Function; then, using the platform-specific renderer, I turned the JSON into an Adaptive Card built with the native UI language of the target platform (XAML or HTML).
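
As a reference, the UWP flavor follows the same pattern we used for WPF, through the AdaptiveCards.Rendering.Uwp package. The snippet below is a rough sketch (the exact method names may vary slightly between package versions, and CardPanel is a hypothetical panel in the page):

```csharp
using AdaptiveCards.Rendering.Uwp;

// UWP rendering sketch: parse the JSON downloaded from the Azure Function
// and ask the UWP renderer for a native XAML element to add to the page.
AdaptiveCardParseResult parseResult = AdaptiveCard.FromJsonString(json);
AdaptiveCardRenderer renderer = new AdaptiveCardRenderer();
RenderedAdaptiveCard renderResult = renderer.RenderAdaptiveCard(parseResult.AdaptiveCard);
CardPanel.Children.Add(renderResult.FrameworkElement);
```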

Wrapping up

In this long post we’ve seen two new powerful features introduced by Microsoft: Adaptive Cards, a way to visually represent information in a standard way across multiple platforms and experiences, and Timeline, a new Windows 10 feature that makes it easier to transition from one device to another. The two features, combined, allow developers to track user activities and turn them into rich visual cards, so that users can quickly jump back to a task they have started on the same machine or on a completely different device. Timeline is included in the next Windows 10 major update, which is coming soon but which you can already try out by joining the Windows Insider Program. The whole source code of the samples used in this post is available on GitHub at https://github.com/qmatteoq/AdaptiveCards, including the other projects where the Adaptive Card is consumed (bot, web, UWP, etc.) which haven’t been described in detail in this post.

Happy coding!
