How to use RabbitMQ with a C# producer and a C# consumer

In this blog post, we will explore how RabbitMQ can be used to create a system with a C# producer and a C# consumer, and how the consumer can save messages into a PostgreSQL database using Entity Framework as the ORM. To make this even easier to set up, we will use Docker to run a local instance of RabbitMQ.

Suppose you’re working for a weather monitoring company that provides real-time weather data analysis for businesses. To do this, you need access to a high volume data source like the National Oceanic and Atmospheric Administration’s (NOAA) Global Forecast System (GFS). The GFS provides access to global weather data, which can be a massive amount of data to process in real-time. In this situation, a messaging service like RabbitMQ can help you manage this data effectively.

The producer will be written in C# and will connect to the GFS API to retrieve weather data. It will then send this data to RabbitMQ as messages. The consumer will also be written in C# and will receive these messages from RabbitMQ. It will then use Entity Framework as the ORM to save the messages into a PostgreSQL database.

To get started, we need to set up a local instance of RabbitMQ using Docker. First, we need to install Docker on our machine. Once Docker is installed, we can run the following command in the terminal to pull the RabbitMQ image that includes the management plugin (the plain rabbitmq image does not include the web interface):

docker pull rabbitmq:3-management

Once the image is downloaded, we can run the following command to start a RabbitMQ container:

docker run -d --hostname my-rabbit --name some-rabbit -p 5672:5672 -p 15672:15672 rabbitmq:3-management

This command starts a RabbitMQ container named some-rabbit and sets the hostname to my-rabbit. We also map the default RabbitMQ ports to the corresponding ports on our machine: 5672 for AMQP traffic and 15672 for the management web interface, which you can open at http://localhost:15672 (default credentials: guest/guest).

Now that we have RabbitMQ up and running, we can proceed with the C# producer and consumer code. First, we need to establish a connection to the GFS API using a library like RestSharp. We can then create a RabbitMQ connection and channel using the RabbitMQ .NET client library. Once we have the channel, we can publish messages to RabbitMQ using the channel’s BasicPublish method. Here’s some sample code to give you an idea of what this might look like:

// Connect to the GFS API
var client = new RestClient("");
var request = new RestRequest("gridpoints/MLB/25,69/forecast", Method.GET);
var response = client.Execute(request);
var data = response.Content;

// Create a RabbitMQ connection and channel
var factory = new ConnectionFactory() { HostName = "localhost" };
using (var connection = factory.CreateConnection())
using (var channel = connection.CreateModel())
{
    // Declare the exchange and queue, and bind the queue to the exchange
    // (without the binding, messages published to the fanout exchange
    // would never reach the queue)
    channel.ExchangeDeclare(exchange: "weather_data", type: ExchangeType.Fanout);
    channel.QueueDeclare(queue: "weather_data_queue", durable: false, exclusive: false, autoDelete: false, arguments: null);
    channel.QueueBind(queue: "weather_data_queue", exchange: "weather_data", routingKey: "");

    // Publish the message to RabbitMQ
    var message = Encoding.UTF8.GetBytes(data);
    channel.BasicPublish(exchange: "weather_data", routingKey: "", basicProperties: null, body: message);
    Console.WriteLine(" [x] Sent {0}", data);
}

The consumer can also be written in C# using the RabbitMQ .NET client library. It can receive messages from RabbitMQ using the channel's BasicConsume method. Once it receives a message, it can use Entity Framework to save the message to the PostgreSQL database. Here's an example of what the consumer code might look like:

// Create a RabbitMQ connection and channel
var factory = new ConnectionFactory() { HostName = "localhost" };
using (var connection = factory.CreateConnection())
using (var channel = connection.CreateModel())
{
    // Declare the queue
    channel.QueueDeclare(queue: "weather_data_queue", durable: false, exclusive: false, autoDelete: false, arguments: null);

    // Create a consumer
    var consumer = new EventingBasicConsumer(channel);
    consumer.Received += (model, ea) =>
    {
        var body = ea.Body.ToArray();
        var message = Encoding.UTF8.GetString(body);
        Console.WriteLine(" [x] Received {0}", message);

        // Save the message to the database using Entity Framework
        using (var dbContext = new WeatherDbContext())
        {
            var weatherData = new WeatherData { Data = message };
            dbContext.WeatherData.Add(weatherData);
            dbContext.SaveChanges();
        }
    };

    // Start consuming messages from RabbitMQ
    channel.BasicConsume(queue: "weather_data_queue", autoAck: true, consumer: consumer);
    Console.WriteLine(" Press [enter] to exit.");
    Console.ReadLine();
}
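The consumer above references a WeatherDbContext and a WeatherData entity that we haven't defined. Here's a minimal sketch of what they might look like using EF Core with the Npgsql provider (this assumes the Npgsql.EntityFrameworkCore.PostgreSQL NuGet package; the table name, property names, and connection string are example assumptions, not part of the original code):

```csharp
using Microsoft.EntityFrameworkCore;

// Hypothetical entity holding the raw forecast payload as a string
public class WeatherData
{
    public int Id { get; set; }
    public string Data { get; set; }
}

// Hypothetical DbContext configured for PostgreSQL via the Npgsql EF Core provider
public class WeatherDbContext : DbContext
{
    public DbSet<WeatherData> WeatherData { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // Example connection string only -- adjust host, database, and credentials
        optionsBuilder.UseNpgsql("Host=localhost;Database=weather;Username=postgres;Password=postgres");
    }
}
```

With a context like this in place, a `dotnet ef migrations add Initial` followed by `dotnet ef database update` would create the backing table before the consumer starts saving messages.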

In this example, we used Docker to run a local instance of RabbitMQ on our machine. We then used C# to create a producer that connected to the GFS API and sent messages to RabbitMQ, and a consumer that received these messages from RabbitMQ and used Entity Framework to save them to a PostgreSQL database.

This system can be used to manage real-time data and process it effectively, making it a valuable tool for monitoring and analysis in high-throughput environments.

Develop .Net Core apps with a SQL Server database on a Mac

Since the release of .Net Core made the dream of cross-platform development a reality, I've been eager to switch my development machine from a Windows laptop to my MacBook Air. I'm happy to say that I've been running 99% on a Mac since July of 2022, and the move has been easy and fantastic. The one exception is legacy .Net apps, which are not cross-platform friendly, so I still use my Windows machine for old .Net code; everything else I develop on my Mac. Outside the obvious environment setup and framework installations, 90% of the migration was super easy – most of my code is in GitHub, so a simple clone of each repository was all that was needed to get up and running.

The biggest lift was finding a solution for SQL Server. SQL Server is not available on macOS, so I had to find a way to develop apps with a SQL Server back end. My first solution was to put all SQL Server databases in the cloud on Azure or AWS. That is definitely a valid option, but I wanted to run locally, so I started looking for alternatives to hosting my development databases in the cloud. That's where Docker comes to the rescue.

It turns out that you can run SQL Server on a Mac pretty easily using a SQL Server Docker container and Azure Data Studio.

SQL Server Docker Container

Turns out you can run the latest version of SQL Server on a Mac using Docker. All that is needed is to set up a Docker container running SQL Server. To get started with SQL Server and Docker, follow this tutorial: Quickstart: Run SQL Server Linux container images with Docker. After the tutorial you should have a running SQL Server instance. Out of the box this instance won't have any databases associated with it, so if you need to restore a database from an existing backup, follow this tutorial: Restore a SQL Server database in Docker.

SQL Server running on Mac

Azure Data Studio

Azure Data Studio is a lot like SQL Server Management Studio, minus most of SSMS's administrative features. It works great for development: you can create databases, run queries, write stored procedures and views, and so on. Essentially, the majority of things you would do with Query Analyzer you can do with Azure Data Studio on a Mac.

Once I had the Docker container running with SQL Server, I used Azure Data Studio to connect to the server instance on the container. The SQL Server instance is listening at localhost:1433, so you can connect just like you would using Management Studio. From there you can use the program just like Query Analyzer.

From here you can run .Net Core apps that connect to your SQL Server running in Docker just like you would if you were running on a Windows machine.
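For the .Net Core app itself, the connection string looks just like one pointing at any other SQL Server instance. Here's a minimal appsettings.json sketch, assuming the sa password you chose in the Docker quickstart (the database name and password below are placeholder examples):

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Server=localhost,1433;Database=MyAppDb;User Id=sa;Password=<YourStrong@Passw0rd>;TrustServerCertificate=True;"
  }
}
```

The TrustServerCertificate=True setting matters with newer Microsoft.Data.SqlClient versions, which encrypt by default and will otherwise reject the container's self-signed certificate.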


There was an error running the selected code generator in .Net Core 5

I’m working on a new .Net Core 5 web app with user authentication where I need to customize some of the Identity account pages (Login, Register, Forgot Password, etc). Out of the box these pages are built into .Net Core so there’s nothing you need to do to use them. However, if you want to customize any of the account pages you’ll need to scaffold the source of those pages into your project.

To scaffold items in Visual Studio 2019 (Version 16.8.4 as of today), right-click the project or parent folder, then select "Add –> New Scaffolded Item". This has worked for me in the past, but recently in .Net Core 5 Visual Studio throws this error during the scaffolding process:

"There was an error running the selected code generator: 'Package restore failed. Rolling back package changes for 'Your App'.'"

I found a workaround for this error by using the dotnet CLI outside of Visual Studio to execute the scaffolding tool. The following steps use the aspnet-codegenerator tool to scaffold the full Identity pages area into your .Net Core 5 app.

  1. Close Visual Studio.
  2. Open a command prompt and change directories to the project location.
  3. Make sure the aspnet-codegenerator tool is installed on your machine by executing this command:
    dotnet tool install -g dotnet-aspnet-codegenerator
  4. Add the Microsoft.VisualStudio.Web.CodeGeneration.Design package to the project if it does not already exist (since Visual Studio is closed, use the dotnet CLI rather than the Package Manager Console):
    dotnet add package Microsoft.VisualStudio.Web.CodeGeneration.Design
  5. Run the following command where MyApp.Models.ApplicationDbContext is the namespace to your DbContext:
    dotnet aspnet-codegenerator identity -dc MyApp.Models.ApplicationDbContext

If the command completed without errors that should have fixed the “There was an error running the selected code generator” issue and created the necessary Identity Pages under Areas/Identity/Pages.

MVC Scaffold Identity Pages

dotnet aspnet-codegenerator also has the ability to scaffold only specific files, versus all the Identity files, if you don't need the full set: pass the --files parameter followed by the files you want to create. (Thanks to Nick for giving me a heads up in the comments about this parameter.)

dotnet aspnet-codegenerator identity -dc MyApp.Models.ApplicationDbContext --files "Account.Register;Account.Login;Account.Logout"