Don't Code Tired

Getting Started with Azure Event Grid


In a previous article we got an introduction to Azure Event Grid. If you’re new to Event Grid, check that article out first to familiarise yourself with some basic concepts.

In this article we’ll create an Azure Event Grid topic and subscription and see it in action.

First off, create a free Azure account if you don’t already have one, then log into the Azure portal.

Next, create a new resource group. Azure Event Grid is currently in preview and only available in selected locations such as West US 2.

Creating a new resource group

Once the resource group is created, head down to the More services option and search for Event Grid.

Navigating to Event Grid Topics

There are topics provided by Azure services (such as blob storage), and there is also the ability to create your own custom topics for custom applications, third parties, etc.

Click Event Grid Topics and this will take you to a list of all your topics. Click the +Add button to begin creating a custom topic. Give the topic the name sales-leads and choose the resource group created earlier, once again in West US 2.

Creating a new Azure Event Grid Topic

Click create, wait for the deployment to complete and hit refresh in the topics list to see your new topic:

Azure Event Grid topic added

Click on the newly added sales-leads topic and notice the overview showing publish metrics:

Event Grid Topic details

At the top right hover over the Topic Endpoint and click the button to copy this to the clipboard (we’ll use this later):

Getting Event Grid Topic endpoint

In this example the copied endpoint is: https://sales-leads.westus2-1.eventgrid.azure.net/api/events

We’ll also need an access key to be able to HTTP POST to this custom topic later. To get one, click the Access keys option and copy Key 1 for later use:

Getting access key for Azure Event Grid topic

Click back on Overview and click the +Event Subscription button:

Creating a new Azure Event Grid Subscription

In this example we’ll create a subscription that will call an external (to Azure) service to mail a conference brochure to all new sales leads. We are simulating a temporary extension to the sales system for a limited period during the run-up to a sales conference. This is one use case for Azure Event Grid: it allows extension of a core system without needing to modify it (assuming that events are being emitted).

To simulate this external service we’ll use RequestBin which you can learn more about in this article. Once you’ve created your request bin, take a note of the Bin URL.

Creating a RequestBin URL

Fill out the new event subscription details:

  • Name: send-upcoming-conference-brochure
  • Subscribe to all event types: Untick
  • Event Types: new-sales-lead-created
  • Subscriber endpoint: https://requestb.in/1bbopge1 (this is the RequestBin URL created above)

Event subscription details

Click Create.

To recap, there is now a custom topic called sales-leads that we can publish events to at its URL: https://sales-leads.westus2-1.eventgrid.azure.net/api/events. There is also an event subscription set up for this topic but that is limited to only those events published of type new-sales-lead-created. This event subscription uses the Azure Event Grid WebHooks event handler to HTTP push events to the RequestBin URL.

To see this in action, open Postman, select POST, and paste in the topic URL (https://sales-leads.westus2-1.eventgrid.azure.net/api/events). Add a header called aeg-sas-key and paste in the key that was copied earlier:

Basic Postman setup

The final thing to do is define the event data that we want to publish:

[
    {
        "id": "42",
        "eventType": "new-sales-lead-created",
        "subject": "myapp/sales/leads",
        "eventTime": "2017-12-07T01:01:36+00:00",
        "data":{
            "firstName": "Jason",
            "postalAddress": "xyz"
        }
    }
]

Event JSON data

And then click Send in Postman. You should get a 200 OK response.

Heading back to the RequestBin window and refreshing the page shows the subscription working and the event being pushed to RequestBin:

RequestBin receiving Azure Event Grid event

Because the event subscription is filtered on an event type of new-sales-lead-created, if we send a different event type from Postman (e.g. "eventType": "new-sales-lead-rejected"), the subscription won’t fire and the event won’t be pushed to RequestBin.
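
As an aside, Postman is just making an HTTP POST, so the same event can be published from code. The following is a minimal C# sketch (the endpoint is the one copied earlier; the topic key placeholder needs replacing with your own Key 1 value):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PublishSalesLeadEvent
{
    static async Task Main()
    {
        // Topic endpoint copied from the portal earlier
        const string topicEndpoint = "https://sales-leads.westus2-1.eventgrid.azure.net/api/events";

        // Placeholder - paste in Key 1 from the Access keys blade
        const string topicKey = "<YOUR-TOPIC-KEY>";

        // Same event payload as used in the Postman example
        const string eventJson = @"[
          {
            ""id"": ""42"",
            ""eventType"": ""new-sales-lead-created"",
            ""subject"": ""myapp/sales/leads"",
            ""eventTime"": ""2017-12-07T01:01:36+00:00"",
            ""data"": { ""firstName"": ""Jason"", ""postalAddress"": ""xyz"" }
          }
        ]";

        using (var client = new HttpClient())
        {
            // Custom topic publishes are authenticated with the aeg-sas-key header
            client.DefaultRequestHeaders.Add("aeg-sas-key", topicKey);

            var content = new StringContent(eventJson, Encoding.UTF8, "application/json");
            HttpResponseMessage response = await client.PostAsync(topicEndpoint, content);

            Console.WriteLine(response.StatusCode); // expect OK (200)
        }
    }
}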


Create Precompiled Azure Functions With Azure Event Grid Triggers


Visual Studio can be used to create precompiled Azure Functions using standard C# classes and tooling, and they can then be published to Azure.

This article assumes you’ve created the resources (resource group, Event Grid Topic, etc.) from this previous article.

In Visual Studio 2017, create a new Azure Functions project.

Next update the pre-installed Microsoft.NET.Sdk.Functions NuGet package to the latest version.

To get access to the Azure Event Grid function trigger attribute, install the Microsoft.Azure.WebJobs.Extensions.EventGrid NuGet package (this package is currently in preview/beta).

Add a new class to the project with the following code:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Azure.WebJobs.Host;

namespace DCTDemos
{
    public static class Class1
    {
        [FunctionName("SendNewLeadWelcomeLetter")]
        public static void SendNewLeadWelcomeLetter([EventGridTrigger] EventGridEvent eventGridEvent, TraceWriter log)
        {
            log.Info($"EventGridEvent" +
                $"\n\tId:{eventGridEvent.Id}" +
                $"\n\tTopic:{eventGridEvent.Topic}" +
                $"\n\tSubject:{eventGridEvent.Subject}" +
                $"\n\tType:{eventGridEvent.EventType}" +
                $"\n\tData:{eventGridEvent.Data}");
        }
    }
}

Notice in the preceding code that the method name SendNewLeadWelcomeLetter is the same as the name specified in the FunctionName attribute. This may be required due to a bug in the current preview/beta implementation – if these are different your function may not be executed when an event occurs.

Right-click the function project and choose Publish. Follow the wizard, create a new Function App, and select the resource group where your Event Grid Topic is. Select West US 2 if you need to create any new Azure resources (storage account, etc.).

Once deployed, head over to Azure Portal, open your new function app and select the newly deployed SendNewLeadWelcomeLetter function:

Adding an Azure Event Grid subscription for an Azure Function

At the top right select Add Event Grid subscription and follow the wizard to create a new subscription – this will enable the new function to be triggered by an Event Grid subscription. As part of the subscription we’ll limit the event type to new-sales-lead-created:

Adding an Azure Event Grid subscription for an Azure Function

Next go to the function app platform features tab and select Log Streaming. We can now use Postman to POST the following JSON to the Event Grid Topic we created earlier.

[
    {
        "id": "1236",
        "eventType": "new-sales-lead-created",
        "subject": "myapp/sales/leads",
        "eventTime": "2017-12-08T01:01:36+00:00",
        "data":{
            "firstName": "Amrit",
            "postalAddress": "xyz"
        }
    }
]

Head back to the streaming logs and you should see your precompiled Azure Function executing in response to the Event Grid event:

2017-12-08T06:38:25  Welcome, you are now connected to log-streaming service.

2017-12-08T06:38:49.841 Function started (Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840)

2017-12-08T06:38:49.841 EventGridEvent
    Id:1236
    Topic:/subscriptions/797e1c4e-3fd4-4cd6-84b8-ef103cee8b6b/resourceGroups/DCTEGDemo/providers/Microsoft.EventGrid/topics/sales-leads
    Subject:myapp/sales/leads
    Type:new-sales-lead-created
    Data:{
      "firstName": "Amrit",
      "postalAddress": "xyz"
    }

2017-12-08T06:38:49.841 Function completed (Success, Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840, Duration=0ms)

 

To learn how to create precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.

New Free eBook: C# 7.2: What's New Quick Start


My new free eBook is now available for download.

The book covers the following new features of C# 7.2:

  • Leading Digit Separator
  • Reference Semantics With Value Types
  • Non-trailing Named Arguments
  • Private Protected Access Modifier
  • Span<T> and ReadOnlySpan<T>

Free C# 7.2 eBook

You can get the book for free (or pay what you’re able to) in PDF, EPUB, and MOBI formats.

Investing In You


I grew up in humble surroundings, my family was for the most part “working class”, I moved around a bit as a kid, moved schools a few times, and lived in state/council housing. At one point as a child (due to some unfortunate circumstances) we lived for a short time in a “homeless” hostel – a transitional place whilst waiting for state housing to be allocated. In one area that I lived as a child I had a knife pulled on me outside a local shop, I learned then how quickly I could run! Today I live in a nice safe suburb, drive a decent car, and generally don’t have to worry too much about personal safety or not having a roof over my head. This is due to some kindnesses I’ve been shown along the way and also by investing. Investing in myself…

I recently completed reading Tony Robbins’ Money Master the Game; it is a good book for those new to investing, with a few chapters being somewhat US-centric. (Other books you may find interesting if you’re just starting out your investing journey include The Little Book of Common Sense Investing and A Random Walk Down Wall Street.) While Money Master the Game contains a lot of information about how to attempt to maximize your financial returns and ways to diversify your portfolio, in it Tony also talks about how you can add more value.

One way to improve your financial investments is by investing in yourself.

One nice idea is that by investing in yourself you can add more value, and if you can add more value you can earn more, and if you can earn more you can invest more.

I had some help and kindnesses shown to me on my journey and, like everyone, I’ve also had some challenges to deal with along the way. Even though I come from a somewhat humble background, and as a white heterosexual male I’ve never had to deal with prejudice, I am lucky that I have always loved to learn. I became fascinated by computers and programming from an early age and was lucky enough to borrow one for a time when I was younger. Eventually my interest and enthusiasm meant I was lucky enough to get my own machine.

Over the many years I continued to learn and was eventually privileged enough to be able to attend university to study computing. Even after starting my first job I continued to learn in my own time, in the evenings and at weekends, always interested in learning more.

At the time I was just following my natural curiosity, but looking back, what I was really doing was investing in myself.

About two and a half years ago I stepped into a gym for the first time in my life. I look back now and smile: my first experience was not pleasant, I didn’t know what exercises to do, I tried bench pressing with an empty bar and wobbled all over the place, while the muscular guy next to me hoisted 50kg dumbbells to the sky. I went home feeling awful and a little stupid. Two days later I went back, and I kept going back. I devoured Arnold Schwarzenegger's Encyclopedia of Modern Bodybuilding and eventually paid for some personal training sessions to learn how to clean and press and bench press properly. Whilst I am not a shredded muscular bodybuilder, I did lose 14kg over two years and add some muscle mass and strength. This is another example of investing in you, this time the physical you. Oftentimes, as developers, we don’t take the best care of ourselves, but I believe investing in the physical you carries over to the work/business you.

As the adage goes, "if you want better answers, ask better questions". One question I’m asking myself this year is: how can I continue to add more value than anyone else? As a software developer and “techie-minded”, in the past I would have thought of a question like this as being big-headed or management-speaky. But if you want to help others you need to help yourself and if you want to help yourself you need to offer value to others.

If you want better answers, ask better questions

It’s good to take a step back sometimes and ask ourselves some questions, especially as we get laser focused on the test we’re writing or the feature we’re working on or the sprint that we’re in, or the next project that might be coming along.

I’m grateful for the opportunities I’ve been given in life, I’m grateful for the challenges and failures and what I’ve learned from them, and I’m grateful for the gift of my lifelong love of learning.

Whilst somewhat dramatic, there is some truth to the phrase “if you’re not growing you’re dying” and if you want to grow you have to invest in you.

Using the Actor Model and Akka.NET for IoT Systems


In addition to cloud-based offerings such as Microsoft’s Azure IoT Suite, it’s also possible to create representations of Internet of Things devices using the Actor Model and Akka.NET. For example, in addition to hosting an Akka.NET actor system in the cloud, you can also have it running on-premises; for example you may have an actor system monitoring an underground railway network for a city. In this case you may already have the infrastructure in place and decide that hosting in the cloud is unnecessary or risky, and deploy to servers geographically close to the underground with some disaster recovery/backup servers located elsewhere.

The Actor Model is a good fit for IoT scenarios due to the inherent concurrency controls, fault-tolerance, and performance/scalability. Akka.NET is a “toolkit and runtime for building highly concurrent, distributed, and fault tolerant event-driven applications on .NET & Mono”[1].

In the Actor Model, the smallest unit of computation is the actor. An actor can receive messages from other actors, perform computations, manage its state, and send messages to other actors.

For IoT scenarios, actors can model the system in a number of ways. For example, for every physical device in the real world there can be an actor instance to represent that device. This means you may have many actor instances, one for each physical device. When a sensor registers a new reading (temperature, proximity, pressure, etc.) it can communicate with the actor responsible for it; the actor receives a message and can update its internal state to represent the latest reading. The actor representing the device can also respond to another type of message to return the last updated value. The device actor may also be responsible for sending messages to the physical device to update its configuration (for example), or this functionality may be broken down into its own actor definition depending on the exact requirements. The device actor may communicate with another actor that is purely responsible for handling the network interfacing required to communicate with devices. In this way, if the “network actor” crashes and restarts, the actors representing the devices don’t crash and lose their state.
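
As a rough sketch only (the message types and actor here are made up for illustration, using Akka.NET’s ReceiveActor), a device actor holding the latest temperature reading might look something like this:

using Akka.Actor;

// Hypothetical messages for illustration
public class RecordTemperature
{
    public RecordTemperature(double reading)
    {
        Reading = reading;
    }

    public double Reading { get; }
}

public class ReadTemperature { }

// One instance of this actor represents one physical sensor device
public class TemperatureDeviceActor : ReceiveActor
{
    private double _lastReading;

    public TemperatureDeviceActor()
    {
        // Update internal state when the device reports a new reading
        Receive<RecordTemperature>(msg => _lastReading = msg.Reading);

        // Reply to the asking actor with the last known reading
        Receive<ReadTemperature>(_ => Sender.Tell(_lastReading));
    }
}

An instance could then be created per physical device from an ActorSystem, for example: system.ActorOf(Props.Create(() => new TemperatureDeviceActor()), "device-42").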

Groups of “device actors” can be created with the supervision hierarchy that the Actor Model provides, to protect groups of actors/devices from crashing other groups of actors/devices. These hierarchies can also provide device management semantics such as registering a new device that has just come online. Actors can also represent “abstract” concepts, such as a dedicated actor created to perform querying of groups of actors.

To learn more about using Akka.NET with IoT scenarios, check out my Representing IoT Systems with the Actor Model and Akka.NET Pluralsight course.

Dynamic Binding in Azure Functions with Imperative Runtime Bindings


When creating precompiled Azure Functions, bindings (such as a blob output binding) can be declared in the function code; for example, the following code defines a blob output binding:

[Blob("todo/{rand-guid}")]

This binding creates a new blob with a random (GUID) name. This style of binding is called declarative binding: the binding details are declared as part of the binding attribute.

In addition to declarative binding, Azure Functions also offers imperative binding. With this style of binding, the details of the binding can be chosen at runtime. These details could be derived from the incoming function trigger data or from an external place such as a configuration value or database item.

To create imperative bindings, rather than using a specific binding attribute, a parameter of type IBinder is used. At runtime, a binding can be created (such as a blob binding, queue binding, etc.) using this IBinder. The Bind<T> method of the IBinder can be used with T representing an input/output type that is supported by the binding you intend to use.

The following code shows imperative binding in action. In this example blobs are created and the blob path is derived from the incoming JSON data, namely the category.

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

public static class CreateToDoItem
{
    [FunctionName("CreateToDoItem")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req,
        IBinder binder,
        TraceWriter log)
    {
        // ToDoItem is assumed to be a simple class with Id, Category, and Description properties
        ToDoItem item = await req.Content.ReadAsAsync<ToDoItem>();
        item.Id = Guid.NewGuid().ToString();

        // Choose the blob path at runtime based on the incoming item's category
        BlobAttribute dynamicBlobBinding = new BlobAttribute(blobPath: $"todo/{item.Category}/{item.Id}");

        using (var writer = binder.Bind<TextWriter>(dynamicBlobBinding))
        {
            writer.Write(JsonConvert.SerializeObject(item));
        }

        return req.CreateResponse(HttpStatusCode.OK, "Added " + item.Description);
    }
}

If the following two POSTs are made:

{
    "Description" : "Lift weights",
    "Category" : "Gym"
}
{
    "Description" : "Feed the dog",
    "Category" : "Home"
}

Then two blobs will be output with the following paths – note the GUID file names and the imperatively bound path segments Gym and Home:

http://127.0.0.1:10000/devstoreaccount1/todo/Gym/5dc4eb72-0ae6-42fc-9a8b-f4bf646dcd28

http://127.0.0.1:10000/devstoreaccount1/todo/Home/530373ef-02bc-4200-a4e7-948448ac081b

Automatic Input Blob Binding in Azure Functions from Queue Trigger Message Data


Reading additional blob content when an Azure Function is triggered can be accomplished by using an input blob binding: define a parameter in the function run method and decorate it with the [Blob] attribute.

For example, suppose you have a number of blobs that need converting in some way. You could initiate a process whereby the list of blob files that need processing are added to a storage queue. Each queue message contains the name of the blob that needs processing. This would allow the conversion function to scale out to convert multiple blobs in parallel.

The following code demonstrates one approach to doing this. The code is triggered from a queue message that contains text representing the input blob filename that needs reading, converting, and then outputting to an output blob container.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run([QueueTrigger("capitalize-names")]string inputBlobPath)
        {
            string originalName = ReadInputName(inputBlobPath);

            var capitalizedName = originalName.ToUpperInvariant();

            WriteOutputName(inputBlobPath, capitalizedName);
        }
        
        private static string ReadInputName(string blobPath)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-in");

            var blobReference = container.GetBlockBlobReference(blobPath);

            string originalName = blobReference.DownloadText();

            return originalName;
        }

        private static void WriteOutputName(string blobPath, string capitalizedName)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-out");

            CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(blobPath);
            cloudBlockBlob.UploadText(capitalizedName);            
        }

    }
}

In the preceding code, there is a lot of blob access code (which could be refactored). This function could however be greatly simplified by the use of one of the built-in binding expression tokens. Binding expression tokens can be used in binding expressions and are specified inside a pair of curly braces {…}. The {queueTrigger} binding token will extract the content of the incoming queue message that triggered a function.

For example, the code could be refactored as follows:

using System.IO;
using Microsoft.Azure.WebJobs;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run(
            [QueueTrigger("capitalize-names")]string inputBlobPath,
            [Blob("names-in/{queueTrigger}", FileAccess.Read)] string originalName,
            [Blob("names-out/{queueTrigger}")] out string capitalizedName)
        {
            capitalizedName = originalName.ToUpperInvariant();
        }
    }
}

In the preceding code, the two [Blob] binding paths make use of the {queueTrigger} token. When the function is triggered, the queue message contains the name of the file to be processed. In the two [Blob] binding expressions, the {queueTrigger} token will automatically be replaced with the text contents of the incoming message. For example, if the message contained the text “File1.txt” then the two blob bindings would be set to names-in/File1.txt and names-out/File1.txt respectively. This means the input blob originalName string will automatically be populated with the blob’s contents when the function is triggered.

To learn more about creating precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.

Stack Overflow Developer Survey 2018 Overview for .NET Developers


The 2018 Stack Overflow Developer Survey was recently released.

This article summarizes some points that .NET developers may find interesting, in addition to some other general items of potential interest.

.NET Points of Interest

  • C# is the 8th most popular programming language among professional developers at 35.3%.
  • TypeScript is the 12th most popular language among professional developers at 18.3%.
  • VB.NET is the 18th most popular programming language at 6.9%.
  • .NET Core is the 4th most popular framework among professional developers at 27.2%.
  • SQL Server is the 2nd most used database among professional developers at 41.6%.
  • Windows Desktop or Server is the 2nd most developed-for platform among professional developers at 35.2%.
  • Azure is the 10th most developed-for platform among professional developers at 11.4%.
  • TypeScript is the 4th most loved language at 67%.
  • C# is the 8th most loved language.
  • VB.NET is the 4th most dreaded language.
  • .NET Core is the 5th most loved framework, the 8th most dreaded framework, and the 5th most wanted framework.
  • Azure is the 5th most loved database (Tables, CosmosDB, SQL, etc.).
  • SQL Server is the 10th most loved database.
  • Visual Studio Code and Visual Studio are the top two most popular development environments respectively (among all respondents).
  • Professional developers primarily use Windows (49.4%) as their development operating system.
  • F# is associated with the highest salary worldwide, with C# 16th highest.
  • .NET development technologies cluster around C#, Azure, .NET Core, SQL Server etc.

General Points of Interest

  • 52.7% of developers spend 9-12 hours per day on a computer.
  • 37.4% of developers don’t typically exercise.
  • 93.1% of professional developers identify as male.
  • 74.3% of professional developers identify as white or of European descent.
  • 85.9% of professional developers use Agile development methodologies.
  • 88.4% of professional developers use Git for version control.

The survey offers a wealth of additional information and you can find the full set of results over at Stack Overflow.


Testing Precompiled Azure Functions Overview


Just because serverless allows us to quickly deploy value, it doesn’t mean that testing is now obsolete.

If we’re using Azure Functions as our serverless platform we can write our code (for example C#) and test it before deploying to Azure. In this case we’re talking about precompiled Azure Functions as opposed to earlier incarnations of Azure Functions that used .csx script files.

Working with precompiled functions means the code can be developed and tested on a local development machine. The code we write is familiar C# with some additional attributes to integrate the code with the Azure Functions runtime.

Because the code is just regular C#, we can use familiar testing tools such as MSTest, xUnit.net, or NUnit. Using these familiar testing frameworks it’s possible to write tests that operate at different levels of granularity.

One way to categorize these tests is into:

  • Unit tests to check core business logic/value
  • Integration tests to check function run methods are operating correctly
  • End-to-end workflow tests that check multiple functions working together

To enable effective automated testing it may be necessary to write functions in such a way as to make them testable, for example by allowing function run method dependencies to be injected at runtime; at test time, mock versions can then be supplied, for example using a framework such as AzureFunctions.Autofac.
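
As a purely illustrative sketch (the AddToDoItem function, its IToDoRepository dependency, and the names here are hypothetical, and the dependency is assumed to be injectable as a method parameter), a test of a function run method could look like this:

using Microsoft.Extensions.Logging;
using Moq;
using Xunit;

public class AddToDoItemShould
{
    [Fact]
    public void StoreNewItem()
    {
        // Mock the function's dependency so no real data store is needed
        var mockRepository = new Mock<IToDoRepository>();

        // Precompiled functions are ordinary static methods, so the run method can be called directly
        AddToDoItem.Run("Walk the dog", mockRepository.Object, new Mock<ILogger>().Object);

        // Verify the function asked the repository to store the item
        mockRepository.Verify(r => r.Add("Walk the dog"), Times.Once());
    }
}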

There are other tools that allow us to more easily test functions locally such as the local functions runtime and the Azure storage emulator.

To learn more about using these tools and techniques to test Azure Functions, check out my Pluralsight course Testing Precompiled Azure Functions: Deep Dive.

Prevent Secrets From Accidentally Being Committed to Source Control in ASP.NET Core Apps


One problem when dealing with developer “secrets” in development is accidentally checking them into source control. These secrets could be connection strings to dev resources, user IDs, product keys, etc.

To help prevent this from accidentally happening, the secrets can be stored outside of the project tree/source control repository. This means that when the code is checked in, there will be no secrets in the repository.

Each developer will have their secrets stored outside of the project code. When the app is run, these secrets can be retrieved at runtime from outside the project structure.

One way to accomplish this in ASP.NET Core projects is to make use of the Microsoft.Extensions.SecretManager.Tools NuGet package, which provides a command line tool. (If you are targeting .NET Core 1.x, also install the Microsoft.Extensions.Configuration.UserSecrets NuGet package.)

Setting Up User Secrets

After creating a new ASP.NET Core project, add a tools reference to the NuGet package to the project; this will add the following item to the project file:

<DotNetCliToolReference Include="Microsoft.Extensions.SecretManager.Tools" Version="2.0.0" />

Build the project and then right click the project and you will see a new item called “Manage User Secrets” as the following screenshot shows:

Managing user secrets in Visual Studio

Clicking the menu item will open a secrets.json file and also add an element named UserSecretsId to the project file. The content of this element is a GUID; the GUID is arbitrary but should be unique for each and every project.

<UserSecretsId>c83d8f04-8dba-4be4-8635-b5364f54e444</UserSecretsId>

User secrets will be stored in the secrets.json file which will be in %APPDATA%\Microsoft\UserSecrets\<user_secrets_id>\secrets.json on Windows or ~/.microsoft/usersecrets/<user_secrets_id>/secrets.json on Linux and macOS. Notice these paths contain the user_secrets_id that matches the GUID in the project file. In this way each project has a separate set of user secrets.

The secrets.json file contains key value pairs.

Managing User Secrets

User secrets can be added by editing the json file or by using the command line (from the project directory).

To list user secrets type: dotnet user-secrets list – at the moment this will return “No secrets configured for this application.”

To set (add) a secret: dotnet user-secrets set "Id" "42"

The secrets.json file now contains the following:

{
  "Id": "42"
}

Other dotnet user-secrets commands include:

  • clear - Deletes all the application secrets
  • list - Lists all the application secrets
  • remove - Removes the specified user secret
  • set - Sets the user secret to the specified value

Accessing User Secrets in Code

To retrieve user secrets in code, in the startup class, access the item by key, for example:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    var secretId = Configuration["Id"]; // returns 42
}
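
For reference, in ASP.NET Core 2.x the default WebHost builder adds the user secrets configuration source automatically when running in the Development environment. If you’re composing configuration yourself, a minimal sketch (assuming the UserSecretsId shown above is present in the project file) looks like this:

using Microsoft.Extensions.Configuration;

public class ConfigExample
{
    public static IConfiguration BuildConfiguration()
    {
        return new ConfigurationBuilder()
            .AddJsonFile("appsettings.json", optional: true)
            .AddUserSecrets<Startup>() // reads secrets.json for this project's UserSecretsId
            .Build();
    }
}

With the secret set earlier, BuildConfiguration()["Id"] would return "42".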

One thing to bear in mind is that secrets are not encrypted in the secrets.json file. As the documentation states: “The Secret Manager tool doesn't encrypt the stored secrets and shouldn't be treated as a trusted store. It's for development purposes only. The keys and values are stored in a JSON configuration file in the user profile directory.” and “You can store and protect Azure test and production secrets with the Azure Key Vault configuration provider.”

There’s a lot more information in the documentation and if you plan to use this tool you should read through it.

MSTest V2


In the (relatively) distant past, MSTest was often used by organizations because it was provided by Microsoft “in the box” with Visual Studio/.NET. Because of this, some organizations trusted MSTest over open source testing frameworks such as NUnit. This was at a time when the .NET open source ecosystem was not as advanced as it is today and before Microsoft began open sourcing some of their own products.

Nowadays MSTest is cross-platform and open source, known as MSTest V2, and as the documentation states: “is a fully supported, open source and cross-platform implementation of the MSTest test framework with which to write tests targeting .NET Framework, .NET Core and ASP.NET Core on Windows, Linux, and Mac.”.

MSTest V2 provides typical assert functionality such as asserting on the values of: strings, numbers, collections, thrown exceptions, etc. Also like other testing frameworks, MSTest V2 allows the customization of the test execution lifecycle such as the running of additional setup code before each test executes. The framework also allows the creation of data-driven tests (a single test method executing multiple times with different input test data) and the ability to extend the framework with custom asserts and custom test attributes.
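
As a small illustrative sketch (class and method names here are arbitrary), a data-driven MSTest V2 test uses the DataRow attribute to run one test method once per row of input data:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CalculatorShould
{
    [DataTestMethod]
    [DataRow(1, 2, 3)]
    [DataRow(5, 5, 10)]
    [DataRow(-1, 1, 0)]
    public void AddTwoNumbers(int a, int b, int expected)
    {
        // The test executes once for each DataRow above
        Assert.AreEqual(expected, a + b);
    }
}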

You can find out more about MSTest V2 at the GitHub repository, the documentation, or check out my Pluralsight course: Automated Testing with MSTest V2.

Customizing C# Object Member Display During Debugging


In a previous post I wrote about Customising the Appearance of Debug Information in Visual Studio with the DebuggerDisplay Attribute. In addition to controlling the high-level debugger appearance of an object, we can also exert a lot more control over how the object appears in the debugger by using the DebuggerTypeProxy attribute.

For example, suppose we have the following (somewhat arbitrary) class:

class DataTransfer
{
    public string Name { get; set; }
    public string ValueInHex { get; set; }
}

By default, in the debugger it would look like the following:

Default Debugger View

To customize the display of the object members, the DebuggerTypeProxy attribute can be applied.

The first step is to create a class to act as a display proxy. This class takes the original object as part of the constructor and then exposes the custom view via public properties.

For example, suppose that we wanted a decimal display of the hex number that is stored as a string property in the original DataTransfer object:

class DataTransferDebugView
{
    private readonly DataTransfer _data;

    public DataTransferDebugView(DataTransfer data)
    {
        _data = data;
    }

    public string NameUpper => _data.Name.ToUpperInvariant();
    public string ValueDecimal
    {
        get
        {
            bool isValidHex = int.TryParse(_data.ValueInHex, System.Globalization.NumberStyles.HexNumber, null, out var value);

            if (isValidHex)
            {
                return value.ToString();
            }

            return "INVALID HEX STRING";
        }
    }
}

Once this view object is defined, it can be selected by decorating the DataTransfer class with the DebuggerTypeProxy attribute as follows:

[DebuggerTypeProxy(typeof(DataTransferDebugView))]
class DataTransfer
{
    public string Name { get; set; }
    public string ValueInHex { get; set; }
}

Now in the debugger, the following can be seen:

Custom debug view showing hex value as a decimal

Also notice in the preceding image that the original object view is still available by expanding the Raw View section.

To learn more about C# attributes and even how to create your own custom ones, check out my C# Attributes: Power and Flexibility for Your Code course at Pluralsight.

Testing for Thrown Exceptions in xUnit.net


When writing tests it is sometimes useful to check that the correct exceptions are thrown at the expected time.

When using xUnit.net there are a number of ways to accomplish this.

As an example consider the following simple class:

public class TemperatureSensor
{
    bool _isInitialized;

    public void Initialize()
    {
        // Initialize hardware interface
        _isInitialized = true;
    }

    public int ReadCurrentTemperature()
    {
        if (!_isInitialized)
        {
            throw new InvalidOperationException("Cannot read temperature before initializing.");
        }

        // Read hardware temp
        return 42; // Simulate for demo code purposes
    }        
}

The first test we could write against the preceding class is to check the “happy path”:

[Fact]
public void ReadTemperature()
{
    var sut = new TemperatureSensor();

    sut.Initialize();

    var temperature = sut.ReadCurrentTemperature();

    Assert.Equal(42, temperature);
}

Next a test could be written to check that if the temperature is read before initializing the sensor, an exception of type InvalidOperationException is thrown. To do this, the xUnit.net Assert.Throws method can be used. When using this method, the generic type parameter indicates the type of expected exception and the method parameter takes an action that should cause this exception to be thrown, for example:

[Fact]
public void ErrorIfReadingBeforeInitialized()
{
    var sut = new TemperatureSensor();

    Assert.Throws<InvalidOperationException>(() => sut.ReadCurrentTemperature());
}

In the preceding test, if an InvalidOperationException is not thrown when the ReadCurrentTemperature method is called the test will fail.

The thrown exception can also be captured in a variable to make further asserts against the exception property values, for example:

[Fact]
public void ErrorIfReadingBeforeInitialized_CaptureExDemo()
{
    var sut = new TemperatureSensor();

    var ex = Assert.Throws<InvalidOperationException>(() => sut.ReadCurrentTemperature());

    Assert.Equal("Cannot read temperature before initializing.", ex.Message);
}

The Assert.Throws method expects the exact type of exception and not derived exceptions. In the case where you want to also allow derived exceptions, the Assert.ThrowsAny method can be used.
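
For example, the following variation of the earlier test uses Assert.ThrowsAny and would also pass if the method threw an exception derived from InvalidOperationException:

[Fact]
public void ErrorIfReadingBeforeInitialized_AllowDerivedExceptions()
{
    var sut = new TemperatureSensor();

    // Passes for InvalidOperationException or any exception type derived from it
    var ex = Assert.ThrowsAny<InvalidOperationException>(() => sut.ReadCurrentTemperature());

    Assert.Equal("Cannot read temperature before initializing.", ex.Message);
}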

Similar exception testing features also exist in MSTest and NUnit frameworks.

To learn more about using exceptions to handle errors in C#, check out my Error Handling in C# with Exceptions Pluralsight course.

Lifelike Test Data Generation with Bogus


Bogus is a lovely library from Brian Chavez to use in automated tests to automatically generate test data of different kinds.

As an example suppose the following class is involved in a unit test:

public class Review
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
    public int Rating { get; set; }
    public DateTimeOffset Created { get; set; }

    public override string ToString()
    {
        return $"{Id} '{Title}'";
    }
}

In a test, a Review instance may need properties populating with values. This could be done manually, for example to check the ToString() implementation:

[Fact]
public void BeRepresentedAsAString()
{
    var sut = new Review
    {
        Id = 42,
        Title = "blah blah"
    };

    Assert.Equal("42 'blah blah'", sut.ToString());
}

Notice in the preceding test that the actual Id and Title values don’t really matter, only the fact that they’re joined as part of the ToString() call. In this example the values for Id and Title could be considered anonymous variables/values in that we don’t really care about them.

The following test uses the Bogus NuGet package and uses its non-fluent facade syntax:

[Fact]
public void BeRepresentedAsAString_BogusFacadeSyntax()
{
    var faker = new Faker("en"); // default en

    var sut = new Review
    {
        Id = faker.Random.Number(),
        Title = faker.Random.String()
    };

    Assert.Equal($"{sut.Id} '{sut.Title}'", sut.ToString());
}

Bogus also has a powerful fluent syntax to define what a test object will look like. To use the fluent version, a Faker<T> instance is created where T is the test object to be configured and created, for example:

[Fact]
public void BeRepresentedAsAString_BogusFluentSyntax()
{
    var reviewFaker = new Faker<Review>()
        .RuleFor(x => x.Id, f => f.Random.Number(1, 10))
        .RuleFor(x => x.Title, f => f.Lorem.Sentence());

    var sut = reviewFaker.Generate(); 

    Assert.Equal($"{sut.Id} '{sut.Title}'", sut.ToString());
}

The first argument to the RuleFor() methods allows the property of the Review object to be selected and the second argument specifies how the property value should be generated. There is a huge range of test data types supported. In the preceding code the Random API is used as well as the Lorem API.

Some examples of the types of auto generated data include:

  • Addresses: ZipCode, City, Country, Latitude, etc.
  • Commerce: Department name, ProductName, ProductAdjective, Price, etc.
  • Company: CompanyName, CatchPhrase, Bs, etc.
  • Date: Past, Soon, Between, etc.
  • Finance: Account number, TransactionType, Currency, CreditCardNumber, etc.
  • Image URL: Random image, Animals image, Nature image, etc.
  • Internet: Email, DomainName, Ipv6, Password, etc.
  • Lorem: single word, Words, Sentence, Paragraphs, etc.
  • Name: FirstName, LastName, etc.
  • Rant: Random user review, etc.
  • System: FileName, MimeType, FileExt, etc.

Some of the random generated values are quite entertaining, for example Rant.Review() may produce "My co-worker Fate has one of these. He says it looks tall."; Company.Bs() may produce "transition cross-media users", and Company.CatchPhrase() may produce "Face to face object-oriented focus group".

Bogus configuration is quite powerful and allows fairly complex setup as the following code demonstrates:

[Fact]
public void CalculateAverageRatingWhenMultipleReviews()
{
    int rating = 0;

    var reviewFaker = new Faker<Review>()
        .RuleFor(x => x.Id, f => f.Random.Number(1, 10))
        .RuleFor(x => x.Rating, f => rating++);

    var productFaker = new Faker<Product>()
        .RuleFor(x => x.PricePerUnit, f => f.Finance.Amount())
        .RuleFor(x => x.Description, f => f.WaffleText(3))
        .FinishWith((f, x) =>
            {
                reviewFaker.Generate(3).ForEach(r => x.Reviews.Add(r));
            });

    var sut = productFaker.Generate();

    Assert.Equal(1, sut.AverageRating); // (0 + 1 + 2) / 3
}

The WaffleText() API is provided by one of the extensions to Bogus (WaffleGenerator.Bogus) that produces inane looking waffle text such as the following:

The Quality Of Hypothetical Aesthetic

"The parallel personal hardware cannot explain all the problems in maximizing the efficacy of any fundamental dichotomies of the logical psychic principle. Generally the requirements of unequivocal reciprocal individuality is strictly significant. On the other hand the characteristic organizational change reinforces the weaknesses in the evolution of metaphysical terminology over a given time limit. The objective of the explicit heuristic discordance is to delineate the truly global on-going flexibility or the preliminary qualification limit. A priority should be established based on a combination of functional baseline and inevitability of amelioration The Quality Of Hypothetical Aesthetic"

 - Michael Stringer in The Journal of the Proactive Directive Dichotomy (20174U)

structure plan.

To make the main points more explicit, it is fair to say that;
  * the value of the optical continuous reconstruction is reciprocated by what should be termed the sanctioned major issue.
  * The core drivers poses problems and challenges for both the heuristic non-referent spirituality and any discrete or Philosophical configuration mode.
  * an anticipation of the effects of any interpersonal fragmentation reinforces the weaknesses in the explicit deterministic service. This may be due to a lack of a doctrine of the interpersonal quality..
  * any significant enhancements in the strategic plan probably expresses the strategic personal theme. This trend may dissipate due to the personal milieu.

 firm assumptions about ideal major monologism evinces the universe of attitude.

The Flexible Implicit Aspiration.

Within current constraints on manpower resources, any consideration of the lessons learnt can fully utilize what should be termed the two-phase multi-media program.

For example, the assertion of the importance of the integration of doctrine of the prime remediation with strategic initiatives cannot be shown to be relevant. This is in contrast to the strategic fit.

To learn more about Bogus head over to the documentation.

Software Development Wheel


An example of a rendered software development wheel showing what areas are in need of attention.

In business/personal coaching there is an idea of a “wheel of life”. The idea is that you evaluate different dimensions of your life and plot them on a wheel. Ideally the wheel should be balanced (and as large) as possible. If the wheel is crooked, you know what areas need to be improved to create a better wheel that turns more easily. When the wheel turns more easily, you get better results.

In a similar vein, the Software Development Wheel allows you to evaluate your overall software development experience.

The tool below can be used by individual developers, the software development team as a whole, or team leaders/managers.

To generate your own wheel fill out the form below.

Each dimension is rated from 1 (“very little or none”) to 5 (“awesome, doing everything possible”) and descriptions for the dimensions are included below.

Be sure to share this page with your fellow developers, team leaders, and management as it’s a great way to start a conversation about how to improve things.


Unit Testing C# File Access Code with System.IO.Abstractions


It can be difficult to write unit tests for code that accesses the file system.

It’s possible to write integration tests that read in an actual file from the file system, do some processing, and check the resultant output file (or result) for correctness. There are a number of potential problems with these types of integration tests, including the potential for them to run more slowly (real IO access overheads), additional test file management/setup code, etc. (This does not mean that some integration tests wouldn’t be useful, however.)

The System.IO.Abstractions NuGet package can help to make file access code more testable. This package provides a layer of abstraction over the file system that is API-compatible with existing code.

Take the following code as an example:

using System.IO;
namespace ConsoleApp1
{
    public class FileProcessorNotTestable
    {
        public void ConvertFirstLineToUpper(string inputFilePath)
        {
            string outputFilePath = Path.ChangeExtension(inputFilePath, ".out.txt");

            using (StreamReader inputReader = File.OpenText(inputFilePath))
            using (StreamWriter outputWriter = File.CreateText(outputFilePath))
            {
                bool isFirstLine = true;

                while (!inputReader.EndOfStream)
                {
                    string line = inputReader.ReadLine();

                    if (isFirstLine)
                    {
                        line = line.ToUpperInvariant();
                        isFirstLine = false;
                    }

                    outputWriter.WriteLine(line);
                }
            }
        }
    }
}

The preceding code opens a text file, and writes it to a new output file, but with the first line converted to uppercase.

This class is not easy to unit test, however; it is tightly coupled to the physical file system by the calls to File.OpenText and File.CreateText.

Once the System.IO.Abstractions NuGet package is installed, the class can be refactored as follows:

using System.IO;
using System.IO.Abstractions;

namespace ConsoleApp1
{
    public class FileProcessorTestable
    {
        private readonly IFileSystem _fileSystem;

        public FileProcessorTestable() : this (new FileSystem()) {}

        public FileProcessorTestable(IFileSystem fileSystem)
        {
            _fileSystem = fileSystem;
        }

        public void ConvertFirstLineToUpper(string inputFilePath)
        {
            string outputFilePath = Path.ChangeExtension(inputFilePath, ".out.txt");

            using (StreamReader inputReader = _fileSystem.File.OpenText(inputFilePath))
            using (StreamWriter outputWriter = _fileSystem.File.CreateText(outputFilePath))
            {
                bool isFirstLine = true;

                while (!inputReader.EndOfStream)
                {
                    string line = inputReader.ReadLine();

                    if (isFirstLine)
                    {
                        line = line.ToUpperInvariant();
                        isFirstLine = false;
                    }

                    outputWriter.WriteLine(line);
                }
            }
        }
    }
}

The key thing to notice in the preceding code is the ability to pass in an IFileSystem as a constructor parameter. The calls to File.OpenText and File.CreateText are now redirected to _fileSystem.File.OpenText and _fileSystem.File.CreateText respectively.

If the parameterless constructor is used (e.g. in production at runtime), an instance of FileSystem will be used; at test time, however, a mock IFileSystem can be supplied.

Handily, the System.IO.Abstractions.TestingHelpers NuGet package provides a pre-built mock file system that can be used in unit tests, as the following simple test demonstrates:

using System.IO.Abstractions.TestingHelpers;
using Xunit;

namespace XUnitTestProject1
{
    public class FileProcessorTestableShould
    {
        [Fact]
        public void ConvertFirstLine()
        {
            var mockFileSystem = new MockFileSystem();

            var mockInputFile = new MockFileData("line1\nline2\nline3");

            mockFileSystem.AddFile(@"C:\temp\in.txt", mockInputFile);

            var sut = new FileProcessorTestable(mockFileSystem);
            sut.ConvertFirstLineToUpper(@"C:\temp\in.txt");

            MockFileData mockOutputFile = mockFileSystem.GetFile(@"C:\temp\in.out.txt");

            string[] outputLines = mockOutputFile.TextContents.SplitLines();

            Assert.Equal("LINE1", outputLines[0]);
            Assert.Equal("line2", outputLines[1]);
            Assert.Equal("line3", outputLines[2]);
        }
    }
}

To see this in action or to learn more about file access, check out my Working with Files and Streams in C# Pluralsight course.

Setting Up Mock ref Return Values in Moq


I recently received a message related to my Mocking in .NET Core Unit Tests with Moq: Getting Started Pluralsight course asking how to set the values of ref parameters.

As a (somewhat contrived) example, consider the following code:

public interface IParser
{
    bool TryParse(string value, ref int output);
}

public class Thing
{
    private readonly IParser _parser;

    public Thing(IParser parser)
    {
        _parser = parser;
    }

    public string ConvertStringIntToHex(string number)
    {
        int i = 0;

        if (_parser.TryParse(number, ref i))
        {
            return i.ToString("X");
        }

        throw new ArgumentException("The value supplied cannot be parsed into an int.", nameof(number));
    }
}

The Thing class requires an IParser to be able to work. In a test, a mocked version of an IParser can be created by Moq as the following initial test demonstrates:

[Fact]
public void ReturnHex_Fail_NoSetup()
{
    var mockParser = new Mock<IParser>();

    var sut = new Thing(mockParser.Object);

    var result = sut.ConvertStringIntToHex("255"); // fails with ArgumentException

    Assert.Equal("FF", result);
}

The preceding test will fail however because the mocked TryParse has not been configured, for example to specify that the method should return true.

The following modified test attempts to fix this:

[Fact]
public void ReturnHex_Fail_NoRefValueSetup()
{
    var mockParser = new Mock<IParser>();
    mockParser.Setup(x => x.TryParse(It.IsAny<string>(), ref It.Ref<int>.IsAny))
              .Returns(true);

    var sut = new Thing(mockParser.Object);

    var result = sut.ConvertStringIntToHex("255");

    Assert.Equal("FF", result); // Fails, actual result == 0
}

In the preceding code, the return value is being set, but nowhere is the ref int output “return value” being configured.

In the following test the Callback method is used to set the ref value. To be able to do this, a delegate must first be defined that matches the signature of the mocked method that contains the ref parameter. Once this delegate is defined it can be used in the Callback method as the following code demonstrates:

// Define a delegate that can be used to set the ref value in the mocked TryParse method 
delegate void MockTryParseCallback(string number, ref int output);

[Fact]
public void ReturnHex()
{
    var mockParser = new Mock<IParser>();
    mockParser.Setup(x => x.TryParse("255", ref It.Ref<int>.IsAny)) // When the TryParse method is called with 255
              .Callback(new MockTryParseCallback((string s, ref int output) => output = 255)) // Execute callback delegate and set the ref value
              .Returns(true); // Return true as the result of the TryParse method

    var sut = new Thing(mockParser.Object);

    var result = sut.ConvertStringIntToHex("255");

    Assert.Equal("FF", result);
}

If you’ve never used Moq or want to learn more about it check out the official Moq quickstart or head over to my Pluralsight course.

Azure Functions Dependency Injection with Autofac


This post refers specifically to Azure Functions V2.

If you want to write automated tests for Azure Functions methods and want to be able to control dependencies (e.g. to inject mock versions of things) you can set up dependency injection.

One way to do this is to install the AzureFunctions.Autofac NuGet package into your functions project.

Once installed, this package allows you to inject dependencies into your function methods at runtime.

Step 1: Create DI Mappings

The first step (after package installation) is to create a class that configures the dependencies. As an example suppose there was a function method that needed to make use of an implementation of an IInvestementAllocator. The following class can be added to the functions project:

using Autofac;
using AzureFunctions.Autofac.Configuration;

namespace InvestFunctionApp
{
    public class DIConfig
    {
        public DIConfig(string functionName)
        {
            DependencyInjection.Initialize(builder =>
            {
                builder.RegisterType<NaiveInvestementAllocator>().As<IInvestementAllocator>(); // Naive

            }, functionName);
        }
    }
}

In the preceding code, a constructor is defined that receives the name of the function that’s being injected into. Inside the constructor, types can be registered for dependency injection. In the preceding code the IInvestementAllocator interface is being mapped to the concrete class NaiveInvestementAllocator.

Step 2: Decorate Function Method Parameters

Now the DI registrations have been configured, the registered types can be injected in function methods. To do this the [Inject] attribute is applied to one or more parameters as the following code demonstrates:

[FunctionName("CalculatePortfolioAllocation")]
public static void Run(
    [QueueTrigger("deposit-requests")]DepositRequest depositRequest,
    [Inject] IInvestementAllocator investementAllocator,
    ILogger log)
    {
        log.LogInformation($"C# Queue trigger function processed: {depositRequest}");

        InvestementAllocation r = investementAllocator.Calculate(depositRequest.Amount, depositRequest.Investor);
    }

Notice in the preceding code the [Inject] attribute is applied to the IInvestementAllocator investementAllocator parameter. This IInvestementAllocator is the same interface that was registered earlier in the DIConfig class.

Step 3: Select DI Configuration

The final step to make all this work is to add an attribute to the class that contains the function method (that uses [Inject]). The attribute used is the DependencyInjectionConfig attribute that takes the type containing the DI configuration as a parameter, for example: [DependencyInjectionConfig(typeof(DIConfig))]

The full function code is as follows:

using AzureFunctions.Autofac;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace InvestFunctionApp
{
    [DependencyInjectionConfig(typeof(DIConfig))]
    public static class CalculatePortfolioAllocation
    {
        [FunctionName("CalculatePortfolioAllocation")]
        public static void Run(
            [QueueTrigger("deposit-requests")]DepositRequest depositRequest,
            [Inject] IInvestementAllocator investementAllocator,
            ILogger log)
        {
            log.LogInformation($"C# Queue trigger function processed: {depositRequest}");

            InvestementAllocation r = investementAllocator.Calculate(depositRequest.Amount, depositRequest.Investor);
        }
    }
}

At runtime, when the CalculatePortfolioAllocation function runs, an instance of NaiveInvestementAllocator will be supplied to the function.

The library also supports features such as named dependencies and multiple DI configurations, to read more check out GitHub.

Remote Debugging Azure Functions V2 "The breakpoint will not currently be hit. No symbols have been loaded for this document"


Sometimes it can be tricky to attach the Visual Studio debugger to a deployed Azure Functions app. For example if you use Cloud Explorer you can right click on the deployed Azure Function and choose Attach Debugger as the following screenshot shows:

Using Cloud Explorer to debug an Azure Function

While this may seem to work at first, you may find that your breakpoints are not actually hit when the function app code executes, with the message “The breakpoint will not currently be hit. No symbols have been loaded for this document”.

The breakpoint will not currently be hit. No symbols have been loaded for this document

As an alternative to attaching via Cloud Explorer, you can try the following approach:

1 Log in to Azure Portal and navigate to your deployed function app.

2 Download the publish profile

Downloading publish profile for Azure Function app

3 Open the downloaded file

Make a note of the userName, userPWD and destinationAppUrl values.

4 Attach Visual Studio Debugger to Azure Function App

  1. Make sure your function app project is open in Visual Studio
  2. Make sure that you have deployed a debug version of the function app to Azure
  3. On the Debug menu choose Attach to Process..
  4. For the Attach to value, click the Select.. button and un-tick everything except Managed (CoreCLR) code
  5. In the Connection target enter the destinationAppUrl (without the preceding http) followed by :4022 – for example: investfunctionappdemotest.azurewebsites.net:4022 – and hit Enter
  6. You should now see an Enter Your Credentials popup box; use the userName and userPWD from step 3 and click Ok
  7. Wait a few seconds for Visual Studio to do its thing
  8. Click the w3wp.exe process and then click the Attach button (see screenshot below). Be patient after clicking as it may take quite a while for all the debug symbols to load.

Attaching to the Azure Functions w3wp.exe process

5 Set your breakpoints as desired

Breakpoint set in function run method

6 Invoke your function code and you should see your breakpoint hit

Once again this may take a while, so be patient; you may also see “a remote operation is taking longer than expected”.

Azure Function breakpoint being hit in Visual Studio

Mocking HttpRequest Body Content When Testing Azure Function HTTP Trigger Functions


When creating Azure Functions that are triggered by an HTTP request, you may want to write unit tests for the function Run method. These unit tests can be executed outside of the Azure Functions runtime, just like testing regular methods.

If your HTTP-triggered function takes as the function input an HttpRequest (as opposed to an automatically JSON-deserialized class) you may need to provide request data in your test.

As an example, consider the following code snippet that defines an HTTP-triggered function.

[FunctionName("Portfolio")]
[return: Queue("deposit-requests")]
public static async Task<DepositRequest> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = "portfolio/{investorId}")]HttpRequest req,
    [Table("Portfolio", InvestorType.Individual, "{investorId}")] Investor investor,
    string investorId,
    ILogger log)
{
    log.LogInformation($"C# HTTP trigger function processed a request.");

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();

    log.LogInformation($"Request body: {requestBody}");

    var deposit = JsonConvert.DeserializeObject<Deposit>(requestBody);

    // etc.
}

If the preceding code is executed in a test, some content needs to be provided to be used when accessing req.Body. To do this using Moq, a mock HttpRequest can be created that returns a specified Stream instance for req.Body.

If you want to create a request body that contains a JSON payload, you can use the following helper method in your tests:

private static Mock<HttpRequest> CreateMockRequest(object body)
{            
    var ms = new MemoryStream();
    var sw = new StreamWriter(ms);

    var json = JsonConvert.SerializeObject(body);

    sw.Write(json);
    sw.Flush();

    ms.Position = 0;

    var mockRequest = new Mock<HttpRequest>();
    mockRequest.Setup(x => x.Body).Returns(ms);

    return mockRequest;
}

As an example of using this method in a test:

[Fact]
public async Task ReturnCorrectDepositInformation()
{
    var deposit = new Deposit { Amount = 42 };
    var investor = new Investor { };

    Mock<HttpRequest> mockRequest = CreateMockRequest(deposit);

    DepositRequest result = await Portfolio.Run(mockRequest.Object, investor, "42", new Mock<ILogger>().Object);

    Assert.Equal(42, result.Amount);
    Assert.Same(investor, result.Investor);
}

When the preceding test is run, the function run method will get the contents of the memory stream that contains the JSON.

To learn more about using Moq to create/configure/use mock objects check out my Mocking in .NET Core Unit Tests with Moq: Getting Started Pluralsight course, or to learn more about MemoryStream and how to work with streams in C# check out my Working with Files and Streams course.
