Continuous Integration at Ocean

Ocean delivers a solution architecture with multiple components. To make that manageable, we use a classical DevOps approach where automation and quality are key.

Aitor Argomániz
Ocean Protocol

--

Introduction

Ocean Protocol is a project with many moving pieces of very different types, all of them open source.

Putting all those pieces together is not easy. From a developer’s point of view, you need a working local setup before you can do any day-to-day development, and as you can imagine, testing and integrating all of this is not something you should do manually.

Adding blockchain technologies to the equation doesn’t make things easier. For these reasons, we have focused since day one on a proper setup to make our lives easier.

Continuous Integration and Delivery

At Ocean, all the compilation, testing, and delivery pipelines are in Travis: nothing new there. What is quite interesting is that we package and deliver all the software in different “flavors” using the Travis pipelines.

We deliver packages in npm, PyPI, Maven and Docker formats

Depending on the type of the project, we generate the corresponding npm, PyPI, Maven or Docker artifacts.

At Ocean, we create Docker images of everything, and all of them are published automatically to the Ocean Protocol Docker Hub as part of the release process.
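As a sketch only (the repository name and job layout here are hypothetical, not taken from a real Ocean `.travis.yml`), a Travis deploy stage that builds and publishes an image to Docker Hub on tagged releases might look like this:

```yaml
# Hypothetical sketch: build and push a component image when a tag is released.
# DOCKER_USERNAME / DOCKER_PASSWORD would be encrypted Travis settings.
deploy:
  provider: script
  script: >-
    docker build -t oceanprotocol/some-component:$TRAVIS_TAG . &&
    echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin &&
    docker push oceanprotocol/some-component:$TRAVIS_TAG
  on:
    tags: true
```

Gating the push on `tags: true` keeps Docker Hub clean: only tagged releases produce published images, while every branch build still exercises the Docker build itself.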

The combination of Docker with Docker Compose provides us with the tools to set up environments easily. The best example of this can be found in our Barge repository.

Developer Friendly

Barge allows developers to set up a totally functional environment including the following components:

  • An Ethereum Parity client with the Ocean Protocol Smart Contracts (keeper-contracts project) deployed.
  • The Parity Secret Store connected to the Ethereum Parity client.
  • An instance of the Aquarius Metadata API.
  • An instance of Brizo, the micro-service run by Publishers to provide services and access to the users of the network.
  • An instance of Pleuston, an example of a Marketplace using Squid to interact with the rest of the network.
Main components started when you run barge

So, what do you need to start implementing a new Marketplace on the Ocean Protocol stack in your local environment?

  1. Download Barge and run the Ocean stack in your local environment.
  2. Take a look at Pleuston. You can use it as a reference for how to use the Squid JavaScript library.
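Those two steps boil down to a couple of commands. This is only a sketch (the Pleuston repository path is assumed, and flags may vary between Barge versions), using the start_ocean.sh script from the Barge repository:

```bash
# Sketch of the local setup (flags may vary between Barge versions):
git clone https://github.com/oceanprotocol/barge
cd barge
./start_ocean.sh --latest          # start the full Ocean stack locally

# In another terminal, grab Pleuston to use as a Squid JS reference:
git clone https://github.com/oceanprotocol/pleuston
```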

Are you a data scientist or data engineer? Do you want to get access to the data available in Ocean?

  1. Download Barge and run the Ocean stack in your local environment
  2. Review the data science tools and check out the Squid Python or Squid Java libraries.

Continuous Integration and Delivery of Smart Contracts

A special case is the Solidity Smart Contracts that are part of the Keeper Contracts project. Dealing with Smart Contracts is sometimes complicated.

The Travis build for Keeper Contracts, apart from the usual test execution, does some interesting things:

  • Creates and publishes the libraries in npm, PyPI and Maven formats, allowing clients to integrate Keeper Contracts directly.
  • Creates and publishes the Docker image of Keeper Contracts, shipping the Solidity Smart Contracts.
  • Generates the linting and security reports (Mythril and Securify) for the Smart Contracts.
Travis compilations of Keeper Contracts
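As an illustration of the last point, the exact job definition below is hypothetical (the contract name is a placeholder, and the Mythril CLI has changed between versions), but a Travis section invoking the security tooling could look roughly like:

```yaml
# Hypothetical sketch of a security-report job for the Solidity sources.
script:
  - npm run lint                          # Solidity/JS linting
  - myth analyze contracts/Contract.sol   # Mythril symbolic analysis (placeholder file)
```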

Integration Testing

With all the components in place, we decided to integrate Barge with the Travis compilation to get our integration tests.

So if you look at the Ocean Protocol projects that require orchestrating multiple components, you can find the following type of configuration in their .travis.yml files:

before_install:  
- git clone https://github.com/oceanprotocol/barge
- cd barge
- export KEEPER_VERSION=v0.6.12
- bash -x start_ocean.sh --latest --no-pleuston --local-spree-node 2>&1 > start_ocean.log &

This starts a complete and totally functional Ocean environment, including a Parity network where we deploy the Keeper Smart Contracts. It means that for integration testing we are not using Ganache; we validate the complete flows using the same components we deploy in our production environments.
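Because start_ocean.sh runs in the background during before_install, the test stage has to wait until the stack is actually up before the integration tests start. A minimal polling helper, with a hypothetical marker-file name standing in for whatever readiness signal the stack exposes, might look like:

```shell
#!/bin/sh
# Wait for a marker file (hypothetical name) that the stack writes once
# the contracts are deployed, giving up after a timeout in seconds.
wait_for_file() {
  file="$1"; timeout="$2"; elapsed=0
  while [ ! -f "$file" ]; do
    [ "$elapsed" -ge "$timeout" ] && return 1
    sleep 1; elapsed=$((elapsed + 1))
  done
  return 0
}

# Demo: simulate the stack creating the marker, then wait for it.
( sleep 2; touch /tmp/contracts_ready ) &
wait_for_file /tmp/contracts_ready 10 && echo "stack ready"
```

In the real pipeline the test stage would call a helper like this before launching the test suite, so flaky "stack not up yet" failures don't pollute the results.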

I hope this gave you some insight into the Continuous Integration and Delivery pipelines we use at Ocean, especially for everything related to Smart Contracts and how to integrate and test them automatically. Hopefully you found some of it useful to apply to your own projects.

All of this definition and configuration was done by the Ocean engineering team. When we started building at Ocean, we committed to doing things in the best possible way. It was tricky at the beginning, but we kept that commitment and worked hard to reach the setup we have today.

Ready to dive in? — Check out the Ocean Documentation to get a sense of what we’ve been working on, and fire all your technical questions in our Gitter chatroom.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements.
