Full Example
Our previous examples have been deliberately simple and do not even begin to show what opscotch can do.
Let's write a workflow that uses some real-world data.
Here is the use case:
We want to continually collect a city's temperature and send that as a metric.
We can use a service to get the weather data: https://openweathermap.org/ has an API that returns the current weather.
This example is a simplistic "real world" example to demonstrate the opscotch agent's mechanics; what's possible is limited only by your imagination.
Getting started
You can follow along with this demonstration on your own opscotch agent.
If you're feeling overwhelmed, read this page through to the end first - once you've seen how easy it is, come back and follow along with your own agent.
Before you start, you'll need a few prerequisites:
- A Java runtime installed (to run the tester)
- The opscotch test runner (jar)
- You should have completed the previous example and been able to run an opscotch workflow on an agent. If you have not done that, please do it first.
- We'll be working in your operating system's terminal, so you need to know what that is and how to use it.
- You'll need copies of the opscotch agent and configuration packager executables that are compatible with the operating system you are using.
- Create a directory somewhere to work in, and put all the files there, including the opscotch executables.
- Ensure you've read the Basic Concepts, specifically the "moving parts".
- You'll need an opscotch (trial) license key.
- Save all text files as UTF-8 with LF line endings.
- Any file paths in text files need to either use `/` instead of `\`, or escape `\` as `\\`.
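For example, both of these spellings of the same Windows path are safe in a JSON file (the key names and path are made up for illustration):

```json
{
  "logFileForwardSlashes": "C:/opscotch/work/agent.log",
  "logFileEscaped": "C:\\opscotch\\work\\agent.log"
}
```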
File setup
In this example, you'll be working with files. For your reference, we'll outline the file structure now for you to refer back to:
working directory
|\_ configs (configs for deploying to the agent)
|  \_ license.txt (your license file)
|  \_ agent.public.key (agent public key file)
|  \_ weather (weather configs for deploying to the agent)
|    \_ weather.bootstrap.json (weather bootstrap file)
|    \_ weather.config.json (weather workflow configuration file)
|    \_ weather.config (weather packaged workflow configuration file)
|    \_ weather.package.config.json (weather packaging configuration file)
|\_ resources (common javascript files)
|  \_ weather (weather specific javascript files)
|    \_ weather-resultsprocessor.js (weather results processor)
|    \_ weather-urlgenerator.js (weather url generator)
|\_ tests (tests folder)
|  \_ weather (weather tests)
|    \_ weather.test.json (the weather test file)
|    \_ weather.bootstrap.json (the weather test bootstrap file)
|    \_ weather.config.json (the weather test configuration file)
|    \_ weather.response.json (the weather API response file)
\_ testrunner.config.json (configures the test runner)
Create three nested folders in the working directory: configs/weather (the configs we'll deploy to the agent), resources/weather (where we store all the javascript files we create), and tests/weather (where we store the testing files we create).
- Linux/macOS
- Windows
mkdir -p configs/weather resources/weather tests/weather
mkdir configs && mkdir configs\weather && mkdir resources && mkdir resources\weather && mkdir tests && mkdir tests\weather
The Weather Data Collection Task
We'll be pulling our data from OpenWeatherMap. You can get an idea about that here, and you can get your free API key here.
For this example, we'll let you use our API key; however, we can't guarantee it will keep working.
a83da2d308613877a88910dce9568dd5
The URL that we're going to use looks like this, and we'll want to provide the city name and the API key:
https://api.openweathermap.org/data/2.5/weather?units=metric&q={city name}&appid={API key}
When called, it effectively returns:
{
"main": {
"temp": 21.55
},
"dt": 1560350645,
"name": "Wellington"
}
Our opscotch task will be as follows:
Every 10 minutes, do this:
- construct the URL to include the city name and the API key.
- execute a call to the URL.
- parse the response into a JSON object.
- send the temperature as a metric.
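The four steps above can be sketched in plain JavaScript. This is an illustration of the logic only, not the real opscotch step API: `httpGet` and `sendMetric` are hypothetical stand-ins for what the agent actually provides.

```javascript
// Sketch of the workflow logic only - not the actual opscotch API.
// httpGet and sendMetric are hypothetical stand-ins, injected so the
// logic stays testable without a network.

function buildUrl(cityName, apiKey) {
  // Step 1: construct the URL with the city name and API key.
  return "https://api.openweathermap.org/data/2.5/weather"
    + "?units=metric&q=" + encodeURIComponent(cityName)
    + "&appid=" + apiKey;
}

function collectTemperature(cityName, apiKey, httpGet, sendMetric) {
  const body = httpGet(buildUrl(cityName, apiKey)); // Step 2: execute the call
  const weather = JSON.parse(body);                 // Step 3: parse the response
  sendMetric({                                      // Step 4: send the metric
    key: "temperature",
    value: weather.main.temp,
    timestamp: weather.dt
  });
}
```

Splitting the URL construction and the result handling into separate functions mirrors the two resource files we'll create later (weather-urlgenerator.js and weather-resultsprocessor.js).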
Approach
As this is a complete example, we're going to take a best-practice approach and do the following:
- Understand the HTTP call.
- Create a test.
- Create a workflow executed against the test.
- Create an agent-deployable workflow.
Understanding the HTTP call
The first task in creating a workflow is understanding what needs to be done; this often starts with understanding the HTTP call and the resulting data.
The URL that we're going to use looks like this:
https://api.openweathermap.org/data/2.5/weather?units=metric&q={city name}&appid={API key}
The best thing to do is to execute the HTTP call using your terminal and check out the results:
- Linux/macOS
- Windows
export OPENWEATHERMAP_KEY=a83da2d308613877a88910dce9568dd5
export CITY_NAME=Wellington
curl "https://api.openweathermap.org/data/2.5/weather?units=metric&q=$CITY_NAME&appid=$OPENWEATHERMAP_KEY"
Use PowerShell:
$env:OPENWEATHERMAP_KEY = 'a83da2d308613877a88910dce9568dd5'
$env:CITY_NAME = 'Wellington'
Invoke-WebRequest -Uri "https://api.openweathermap.org/data/2.5/weather?units=metric&q=$env:CITY_NAME&appid=$env:OPENWEATHERMAP_KEY"
You should get results like this:
{
"coord": {
"lon": 174.7756,
"lat": -41.2866
},
"weather": [
{
"id": 803,
"main": "Clouds",
"description": "broken clouds",
"icon": "04d"
}
],
"base": "stations",
"main": {
"temp": 12.53,
"feels_like": 11.95,
"temp_min": 12.53,
"temp_max": 15.51,
"pressure": 1016,
"humidity": 81
},
"visibility": 10000,
"wind": {
"speed": 10.29,
"deg": 340
},
"clouds": {
"all": 54
},
"dt": 1666737490,
"sys": {
"type": 2,
"id": 2007945,
"country": "NZ",
"sunrise": 1666718172,
"sunset": 1666767204
},
"timezone": 46800,
"id": 2179537,
"name": "Wellington",
"cod": 200
}
When we review our task - "collect the temperature from a city" - we can reduce the JSON down to just what we need: the temperature and the timestamp:
{
"main": {
"temp": 12.53
},
"dt": 1666737490
}
Now that we have an understanding of the available data and how to get it, we can start with the tests.
Testing
Tests are essential: for every workflow you write, you should write a test.
A test means you can be confident that the opscotch agent executes consistently, especially important during workflow development and upgrades.
The Test Runner Configuration
Let's create the test runner configuration - this tells the test runner how to find our resources and tests: create a file in the working directory called testrunner.config.json with this content:
{
"resourceDirs": [
"resources"
],
"testDirs": [
"tests"
],
"license": "license.txt"
}
Create a file in the working directory called license.txt and replace its contents with your supplied license (i.e. paste in the contents of your license).
The Weather Test File
In the tests/weather directory, create a file called weather.test.json - this is where we set up our test for the weather workflow. Take a look at the test schema in the API reference.
Files named *.test.json are recognised and discovered by the test runner as test files.
Add this content:
{
"fromDirectory": "weather",
"useThisTestBootstrapFile": "weather.bootstrap.json",
"useThisTestConfigFile": "weather.config.json",
"testThisStep" : "current-temperature",
"mockEndpointResponses" : [
{
"whenThisIsCalled" : "http://mockserver/data/2.5/weather?units=metric&q=Wellington&appid=1234",
"returnThisFile" : "weather.response.json"
}
],
"theseMetricsShouldHaveBeenSent": [
{
"key" : "temperature",
"value" : 12.53,
"timestamp" : 1666737490
}
]
}
What we've set up so far:
- `fromDirectory` tells the test runner the working directory for the test.
- `useThisTestBootstrapFile` tells the test which Bootstrap file to load - this is specific to the test and wouldn't be deployed to an agent.
- `useThisTestConfigFile` tells the test which workflow file to load.
- `testThisStep` tells the test which step to execute.
- `mockEndpointResponses` tells the test how to respond to HTTP requests from the agent.
- `theseMetricsShouldHaveBeenSent` tells the test which metrics to expect the agent to send during the test.
These files don't exist yet; let's create them.
The weather.bootstrap.json
This Bootstrap construct offers us a few advantages:
- Defines allowed hosts - security comes first in the agent, and you're not allowed to call out to any old host.
- Defines allowed HTTP methods and paths.
- Abstracts the host details from the workflow configuration such that we can use the same configuration with different hosts.
- Enables testing the workflow configuration without actually using a real server.
The weather.bootstrap.json file is the same as any other Bootstrap file and, in this case, is specific to the test: since this is a test, we want total control over what data the agent gets when it makes an API call. If we let the agent call out to the actual API, we'd get constantly changing weather data, which makes for ineffective testing. We get around this by mocking the API response for the test. That starts with the test Bootstrap file: instead of defining the real OpenWeatherMap API server, we'll define one called mockserver for the test to use.
Let's do that now: update the contents of tests/weather/weather.bootstrap.json to this:
{
"hosts" : {
"openweathermap" : {
"host" : "http://mockserver",
"allowlist" : [
["GET", "/data/2.5/weather"]
]
}
}
}
This defines:
- A host record called `openweathermap` that points to the host `http://mockserver` (the test runner - this would be `https://api.openweathermap.org` in the actual Bootstrap file)
- An `allowlist` that permits an HTTP `GET` call on the path `/data/2.5/weather`
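To make the allowlist idea concrete, here's a minimal sketch of how a [method, path] allowlist check could behave. This is our illustration, written under the assumption that the query string is ignored when matching - opscotch's actual matching rules may differ.

```javascript
// Illustration only: a [method, path] allowlist check. We assume the
// query string is ignored during matching; opscotch's real rules may differ.
function isAllowed(allowlist, method, url) {
  const path = new URL(url).pathname; // strip the host and query string
  return allowlist.some(([allowedMethod, allowedPath]) =>
    allowedMethod === method && allowedPath === path);
}
```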
The weather.response.json
In the test configuration, we defined mockEndpointResponses to be the following:
{
"mockEndpointResponses" : [
{
"whenThisIsCalled" : "http://mockserver/data/2.5/weather?units=metric&q=Wellington&appid=1234",
"returnThisFile" : "weather.response.json"
}
]
}
When the agent calls http://mockserver/data/2.5/weather?units=metric&q=Wellington&appid=1234 during the test, the contents of weather.response.json will be returned. Since this is an array, we can define any number of these.
Notice that we refer to http://mockserver - which comes from the test Bootstrap.
Remember that we're pretending (mocking) to be the OpenWeatherMap API server, so the contents of weather.response.json should be what the OpenWeatherMap API server would return. However, if we set the content to the trimmed-down version (i.e. only what we want to use), we also give anyone reading the tests a clue about the data we care about. So let's set the content of tests/weather/weather.response.json to this:
{
"main": {
"temp": 12.53
},
"dt": 1666737490
}
The Weather Metric
In the test configuration file, we defined this:
{
"theseMetricsShouldHaveBeenSent": [
{
"key" : "temperature",
"value" : 12.53,
"timestamp" : 1666737490
}
]
}
This declares that when the test runs, we expect the agent to send a metric with a key of temperature, a value of 12.53, and a timestamp of 1666737490 - that's from the dt field in the OpenWeatherMap API response.
If that doesn't happen, the test will fail - it's up to us to make it happen in the workflow!
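A results processor along these lines would produce that metric. The sketch below shows the shape of the logic only - the real function signature and metric-sending call come from opscotch when we write weather-resultsprocessor.js, and `sendMetric` here is a hypothetical stand-in:

```javascript
// Hypothetical sketch of the results-processing step: turn the mocked
// response body into the metric the test expects. sendMetric stands in
// for whatever opscotch actually provides.
function processWeatherResponse(responseBody, sendMetric) {
  const weather = JSON.parse(responseBody);
  sendMetric({
    key: "temperature",
    value: weather.main.temp,   // 12.53 in the mocked response
    timestamp: weather.dt       // 1666737490 in the mocked response
  });
}
```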
You don't have to define the expected timestamp, because sometimes you won't know it in advance - for the curious, there are ways around this.