Creating Expedient Microservices in Rust and Diesel

Written by: Taylor Jones

Rust is one of the most talked-about languages out there right now. While the language only hit 1.0 last May, it has quickly matured in stability and features. Because of this rapid but stable growth, Rust has caught the attention of many in the programming world. So what makes Rust so attractive? Is it worth our time and attention?

What makes Rust really exciting to me is that it's starting to become known as a C/C++ alternative. I personally never cared much for the design of C and C++, but Rust seems to address a lot of what I didn't enjoy about those languages. More importantly, it's gotten me excited about low-level systems programming again!

As a Rubyist, I've been incredibly interested in projects such as Helix that allow developers to use Rust as a means to extend the functionality of Ruby libraries (instead of C). With ideas like Helix gestating and maturing, Rust may very well become an essential part of Ruby-based stacks in the near future.

There's a lot happening in the Rust community worth talking about. However, while I think the future of Rust is exciting, it's important to look at how Rust is being utilized today.

I've written before on the idea that adding a fairly new language or framework to your stack can be a large risk. Breaking changes tend to occur a lot more often in younger languages, and Rust is no exception. While companies such as Dropbox and Mozilla have significant Rust-based projects in production, I still believe it's wise for us to take a smaller risk than they have.

This is where microservices have a strong advantage in your architecture. They enable you to take a risk on something new, like Rust, and implement it into your ecosystem at a fairly low cost. Again, this doesn't eliminate the risk and perils of using a younger language. However, it certainly enables you to take risks in smaller doses.

The idea that I want to explore is what it would be like to create a small microservice in Rust. How hard is it? Is it even possible? What tooling will I need? There's a healthy list of questions that comes with this idea.

So here's what we'll do: We're going to assume that we have a blog application with a basic API. Though it sounds kind of silly, we're going to create a Rust service that consumes this API and stores each blog post into a database. Assuming that we have a blog API project that looks something like this, let's leverage Rust to create something great!

Cultivating Rust

Before we start any of this, we need to figure out the best way to introduce Rust to our programming environment.

There are a couple of really easy ways to set up Rust. However, I personally use RustUp to handle my Rust versioning. It's much like Ruby's RVM or Node's NVM.

What's awesome about RustUp is its ability to switch between Rust's two main release channels with ease. For those who don't know, Rust essentially ships two flavors of the language. One flavor (or channel) is known as Stable, which contains stable features and functionality. The other is known as Nightly, which contains the edge features and functionality.

For what we're building today, I'll be using Nightly. This might seem like a weird choice since we're looking to create something that's more "stable." However, with Rust, it's a bit wiser to plan for the future since we know that many of these features will find their way into Stable Rust eventually.

To set this up in RustUp:

rustup install stable
rustup install nightly
rustup default nightly

This should install both channels and make Nightly the default Rust toolchain for any new projects we create. Don't fret if you want to use Stable. You'll be free to switch back to it whenever you want!
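If you'd rather not change the global default, rustup can also pin a toolchain to a single project directory. A quick sketch (newer rustup versions; swap channels to taste):

```shell
# Switch the global default back to Stable at any time
rustup default stable

# Or pin only the current project's directory to Nightly
rustup override set nightly
```

With an override in place, any cargo command run inside that directory uses Nightly, while the rest of your system stays on Stable.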

Now to create the Rust project through Cargo (Rust's native package manager):

cargo new rustweb

This should create a really simple Rust project in your current directory. If you crack open the project, you can see that it contains a file called Cargo.toml. This is where our project's dependencies will be declared.
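For reference, Cargo generates only a couple of files at this point — roughly this layout (the exact extras, such as a .gitignore, vary by Cargo version):

```
rustweb/
├── Cargo.toml
└── src/
    └── lib.rs
```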

Cargo.toml is very similar to something we'd see in NPM's package.json or Rails' Gemfile. The file lists the project dependencies, and our wonderful tools help us figure out the rest.

Firing Up Rust With Diesel

An awesome group of software developers have recently created a Rust framework known as Diesel. Diesel is interesting because it takes a lot of its ORM design inspiration from Rails' Active Record. Because of my experience in Rails, I found Diesel really attractive and easy to use.

Diesel requires a bit of additional setup, but it allows us to query data against a database of our choice with relative ease. It'll also help give us a stable backbone to build our microservice. Let's take a look.

Setting up Diesel

We'll be following Diesel's published tutorial closely. However, we'll veer off its path about midway to create our service. If something I say confuses you, please refer to their guide for clarity.

To introduce Diesel and other dependencies to our Rust project, we'll need to update our Cargo.toml to show the following:

Cargo.toml

[dependencies]
diesel = "0.8.0"
diesel_codegen = { version = "0.8.0", features = ["postgres"] }
dotenv = "0.8.0"
hyper = { version = "0.9", default-features = false, features = ["security-framework"] }
rustc-serialize = "0.3.19"

To pull these libraries into our project, run cargo build. This will not only fetch the libraries but also build an instance of our app against them. This is especially helpful for determining which dependencies do and don't work with your codebase.

While we now have Diesel's library integrated into our Rust app, we still need a means to control it through the command line. To do this, we'll want to install diesel_cli by running: cargo install diesel_cli.

Next, we're going to tell Diesel where to look for our database. In our case, we're creating a microservice with its own new database. For our app, I recommend using Postgres. However, Diesel is designed to handle a variety of database options if you want something different.

We'll name our database: rustweb.

echo DATABASE_URL=postgres://username:password@localhost/rustweb > .env

To complete our setup, we're going to run diesel setup to finish the job. This should configure our project and environment to work properly with the Diesel framework. After a few simple commands, we're ready to start drawing up the foundation of our database.

Crafting our database

Since our app is a microservice, we're assuming that our project depends on some other kind of app. Whether that app is a giant monolith or a series of smaller microservices, we need to design our microservice so that it can translate and store data from other sources. Effective microservices are like interchangeable parts, and our app needs to reflect that modularity.

We're going to start this process by building out a posts table that will store the blog posts we're consuming via the external API. To build this in Diesel, we'll enter diesel migration generate create_posts.

The generator will create an up.sql and a down.sql file, which will help us build the database. I'm not sure how comfortable you might be with writing pure SQL, but Diesel is a great place to start learning.

In our up.sql migration, we're going to write the logic to create the actual posts table in our database. Given an abstract Post model, it should probably look something like this:

Post
- id (int)
- title (string)
- body (text / string)

We'll write a SQL migration along the lines of:

up.sql

CREATE TABLE posts (
    id SERIAL PRIMARY KEY,
    title VARCHAR NOT NULL,
    body TEXT NOT NULL
);

Before we move on, let's talk about Rust's primitive data structures.

Rust uses lower-level data types, very similar to those of C and C++. This might seem a bit foreign to those who work in something like Ruby or JavaScript, but it certainly doesn't hurt to learn how to deal with different data primitives and structures. Rust gives you more control over the data you're handling; in exchange, it requires a few more steps to handle everything properly.
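To make that concrete, here's a minimal, dependency-free sketch of how the posts table's SQL types map onto Rust's types (the sample values are made up):

```rust
// SQL-to-Rust type mapping for our posts table:
//   SERIAL  -> i32    (32-bit signed integer)
//   VARCHAR -> String (owned, growable UTF-8 text)
//   TEXT    -> String
struct Post {
    id: i32,
    title: String,
    body: String,
}

fn main() {
    let post = Post {
        id: 1,
        title: String::from("Hello"),
        body: String::from("Our first post"),
    };
    println!("{}: {}", post.id, post.title);
}
```

Note that unlike Ruby or JavaScript, there's no single catch-all "number" or "string" here — we pick a concrete representation up front, which is exactly what Diesel leans on when mapping rows to structs.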

Our down.sql is going to look a whole lot less intimidating because it has a very simple job: dropping the posts table.

down.sql

DROP TABLE posts;

To actually create the tables in the database, we need to run diesel migration run.

That's it! We now have the database we want. Now that we have a basic database structure, let's deal with the inner logic of our Diesel app.

Building out the library logic

Remember the rest of those files that were generated by Cargo? We're going to start with src/lib.rs. This is the library file we'll use to connect all the individual pieces together. Think of it as the main entry point for our whole app.

To start things off, we need to write something to connect us to the database that we just created. We'll also want to build our models and import our schema. We're going to do all of this in one fell swoop:

src/lib.rs

#![feature(proc_macro)]
#[macro_use] extern crate diesel;
#[macro_use] extern crate diesel_codegen;
extern crate dotenv;
pub mod schema;
pub mod models;
use diesel::prelude::*;
use diesel::pg::PgConnection;
use dotenv::dotenv;
use std::env;
use self::models::{Post, NewPost};
pub fn establish_connection() -> PgConnection {
    dotenv().ok();
    let database_url = env::var("DATABASE_URL")
        .expect("DATABASE_URL must be set");
    PgConnection::establish(&database_url)
        .expect(&format!("Error connecting to {}", database_url))
}
pub fn create_post(conn: &PgConnection, title: &str, body: &str) -> Post {
    use schema::posts;
    let new_post = NewPost {
        title: title,
        body: body,
    };
    diesel::insert(&new_post).into(posts::table)
        .get_result(conn)
        .expect("Error saving new post")
}

What we have so far is a whole bunch of header content and two functions. It seems like a lot of setup for just a few methods.

However, think of this as home base for our app. Everything that we're doing is going to run through this. It's essential because it ties this whole thing together.

It's worth pointing out how the establish_connection function simply reads the database URL from the .env file we created and attempts to connect to Postgres with it. It's pretty simple!

If you look closely at the section that has:

pub mod schema;
pub mod models;

you might begin to wonder where the schema and models modules actually live. Well, we're about to write them!

src/models.rs

use schema::posts;
#[derive(Queryable)]
pub struct Post {
    pub id: i32,
    pub title: String,
    pub body: String
}
#[derive(Insertable)]
#[table_name="posts"]
pub struct NewPost<'a> {
    pub title: &'a str,
    pub body: &'a str,
}

This gives us a queryable Post struct and an insertable NewPost struct. We'll use these two in tandem to insert and query information against our database. It's a pretty straightforward file overall. The same could be said of the schema.rs file:

src/schema.rs

infer_schema!("dotenv:DATABASE_URL");

All this file does is infer the database schema at compile time from the database that the DATABASE_URL in your .env file points to. This gives us the crucial link for interacting with our database. It's a short and sweet example of how simple Rust and Diesel can be!
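With the schema inferred and the Post struct marked Queryable, reading rows back is short. Here's a hedged sketch modeled on Diesel's own getting-started guide for this era of the API — the bin file name is hypothetical:

```rust
// src/bin/show_posts.rs -- hypothetical companion script (Diesel 0.8-era API)
extern crate rustweb;
extern crate diesel;

use self::rustweb::*;
use self::rustweb::models::Post;
use self::diesel::prelude::*;

fn main() {
    use rustweb::schema::posts::dsl::*;

    let connection = establish_connection();
    // load() maps each returned row onto our Queryable Post struct
    let results = posts.load::<Post>(&connection)
        .expect("Error loading posts");

    println!("Displaying {} posts", results.len());
    for post in results {
        println!("{}", post.title);
    }
}
```

Once there's data in the table, cargo run --bin show_posts would print every stored title.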

Going back into our lib.rs file, we have the create_post function.

pub fn create_post(conn: &PgConnection, title: &str, body: &str) -> Post {
    use schema::posts;
    let new_post = NewPost {
        title: title,
        body: body,
    };
    diesel::insert(&new_post).into(posts::table)
        .get_result(conn)
        .expect("Error saving new post")
}

create_post is interesting because it takes in a Postgres connection plus title and body attributes, and builds a NewPost from them. We also use our schema here to tell Diesel which table the NewPost should be inserted into.
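As a quick illustration, calling it from a scratch binary might look like this (a sketch — the file name and sample strings are hypothetical):

```rust
// src/bin/write_post.rs -- hypothetical demo of lib.rs's create_post
extern crate rustweb;

use self::rustweb::*;

fn main() {
    let connection = establish_connection();
    // get_result inside create_post hands back the freshly inserted row as a Post
    let post = create_post(&connection, "Hello from Rust", "Inserted via Diesel");
    println!("Saved post {} with title: {}", post.id, post.title);
}
```

Notice that we never wrote the id — Postgres assigns it via the SERIAL column, and Diesel returns it to us on the Post struct.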

Writing executable scripts

Now that we've created a really solid foundation, let's write the Rust code that interacts directly with our API.

The way Rust library projects work is that we create executable files in the src/bin/ folder. We then take the pieces from our library and use them to perform tasks. To demonstrate, we're going to create the following:

src/bin/get_posts_from_source.rs

extern crate rustweb;
extern crate diesel;
extern crate hyper;
extern crate rustc_serialize;
use self::rustweb::*;
use hyper::{Client};
use std::io::Read;
use rustc_serialize::json;
use std::env::args;
// Automatically generates traits to the struct
#[derive(RustcDecodable, RustcEncodable)]
pub struct PostSerializer {
    id: u8,
    title: String,
    body: String,
}
fn main() {
    let start_s = args().nth(1).expect("Please provide a min id");
    let start: i32 = match start_s.parse() {
        Ok(n) => n,
        Err(_) => {
            println!("error: first argument not an integer");
            return;
        }
    };
    let stop_s = args().nth(2).expect("Please provide a max id");
    let stop: i32 = match stop_s.parse() {
        Ok(n) => n,
        Err(_) => {
            println!("error: second argument not an integer");
            return;
        }
    };
    // start..stop + 1 makes the range inclusive of the max id
    for x in start..stop + 1 {
        let url = format!("http://localhost:3000/api/v1/posts/{}", x);
        let response = get_content(&url).unwrap();
        let decoded: PostSerializer = json::decode(&response).unwrap();
        create_post_from_object(&decoded);
    }
}
fn get_content(url: &str) -> hyper::Result<String> {
    let client = Client::new();
    let mut response = try!(client.get(url).send());
    let mut buffer = String::new();
    try!(response.read_to_string(&mut buffer));
    Ok(buffer)
}
fn create_post_from_object(post: &PostSerializer) {
    let connection = establish_connection();
    println!("==========================================================");
    println!("Title: {}", post.title);
    println!("==========================================================\n");
    println!("{}\n", post.body);
    create_post(&connection, &post.title, &post.body);
}

Whew. That's a lot to take in. Let's break it down into pieces starting with the header and serializer model:

extern crate rustweb;
extern crate diesel;
extern crate hyper;
extern crate rustc_serialize;
use self::rustweb::*;
use hyper::{Client};
use std::io::Read;
use rustc_serialize::json;
use std::env::args;
// Automatically generates traits to the struct
#[derive(RustcDecodable, RustcEncodable)]
pub struct PostSerializer {
    id: u8,
    title: String,
    body: String,
}

What's interesting about this bit of code is that we're doing a lot of things we've seen before: importing our project and Diesel. However, a few other folks are joining the party now. This file uses the Hyper library and the rustc-serialize crate.

These additional libraries let us request and deserialize a JSON response from the server. Hyper does the requesting; rustc-serialize does the decoding. Simple enough.
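To see the serializer half in isolation: decoding a JSON string into our struct is a single call. A small sketch (the payload below is made-up sample data, not a real API response):

```rust
extern crate rustc_serialize;

use rustc_serialize::json;

// Automatically generates decode/encode traits for the struct
#[derive(RustcDecodable, RustcEncodable)]
pub struct PostSerializer {
    id: u8,
    title: String,
    body: String,
}

fn main() {
    // A made-up payload shaped like our blog API's response
    let raw = r#"{"id": 1, "title": "Hello", "body": "Our first post"}"#;
    let post: PostSerializer = json::decode(raw).unwrap();
    println!("Decoded post {}: {}", post.id, post.title);
}
```

The field names in the struct must match the JSON keys exactly, which is why PostSerializer mirrors the API's id/title/body shape.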

Next, we're going to take a look at the meat of our file, get_content and create_post_from_object:

fn get_content(url: &str) -> hyper::Result<String> {
    let client = Client::new();
    let mut response = try!(client.get(url).send());
    let mut buffer = String::new();
    try!(response.read_to_string(&mut buffer));
    Ok(buffer)
}

get_content uses the Hyper library to pull data from our localhost URL. It reads the response body into a String that Rust can work with. If all of this succeeds, it returns the JSON representation of an API Post object as a String.

fn create_post_from_object(post: &PostSerializer) {
    let connection = establish_connection();
    println!("==========================================================");
    println!("Title: {}", post.title);
    println!("==========================================================\n");
    println!("{}\n", post.body);
    create_post(&connection, &post.title, &post.body);
}

create_post_from_object takes a decoded PostSerializer and establishes a connection to our database. It then utilizes our create_post function to create the Post record in the database!

Finally, we're going to see how our main() method ties all of these functions together.

fn main() {
    let start_s = args().nth(1).expect("Please provide a min id");
    let start: i32 = match start_s.parse() {
        Ok(n) => n,
        Err(_) => {
            println!("error: first argument not an integer");
            return;
        }
    };
    let stop_s = args().nth(2).expect("Please provide a max id");
    let stop: i32 = match stop_s.parse() {
        Ok(n) => n,
        Err(_) => {
            println!("error: second argument not an integer");
            return;
        }
    };
    // start..stop + 1 makes the range inclusive of the max id
    for x in start..stop + 1 {
        let url = format!("http://localhost:3000/api/v1/posts/{}", x);
        let response = get_content(&url).unwrap();
        let decoded: PostSerializer = json::decode(&response).unwrap();
        create_post_from_object(&decoded);
    }
}

We iterate across the given range of record IDs in our API and create a Post for each one, utilizing the PostSerializer we defined above to shape the data into something Diesel can consume.

Assuming that we have 10 posts in our API database, we can run the script by typing the following into the console:

cargo run --bin get_posts_from_source 1 10

You should see our code in action, outputting the API post data and storing it to our rustweb database!

Let's decompress from all that we've gone through here.

Wrapping Up

We've gone through a lot of material, and it might feel overwhelming or overly complex. However, I encourage you to use this to help feel out whether writing a Rust microservice is right for you. Think of it as sketching out an idea, with the hope that it becomes something incredibly useful.

If you're interested in where I'll be taking this idea in the future, follow along with the project on GitHub! I'd love to see some input and advice on how I'm doing.

Rust can take a while to get the hang of. However, it's an incredibly beneficial language to explore and learn from. Maybe in your learning you might find something that benefits you and your organization for many years to come!

