Developing Core Services
This tutorial shows how a backend developer can use Codezero when developing a Core Service. In the Sample Project, the backend comprises three services that handle frontend requests: the Core, Leaf, and Socket services.
Traditionally the backend developer would need to run all (or many) of an application's microservices locally in order to work on one of the backend services.
In this tutorial we will show two scenarios: serving a Leaf service, and consuming while serving a Core service.
- Leaf service: The developer runs only the Leaf service locally, which serves requests in place of the cluster-based Leaf service. No other services run locally.
- Core service: The developer runs only the Core service locally, which serves requests in place of the cluster-based Core service but must consume the Leaf service in the cluster. No other services run locally.
Objectives
In this tutorial, you will learn:
- How to develop a Leaf Service locally so that services in the cluster talk to the local service.
- How to develop a Core Service locally so that it serves cluster traffic while consuming other cluster-based services.
- How to collaborate with a frontend developer on another machine.
Prerequisites
It is assumed you have the standard prerequisites:
- A Kubernetes cluster
- The Codezero CLI installed
- The sample project cloned locally
The tutorial assumes you are at the root of the Sample Project repo, have completed the Sample Project tutorial, and have the sample project running in your cluster in the namespace sample-project.
Leaf Tutorial
Run the Leaf Service Locally
Let's start by running the backend Leaf service locally. In the sample project root run:
yarn start-leaf
You should be able to access the local Leaf service at http://localhost:3010/api. You should see an API response that looks something like this:
{"who": "leaf", "where": "Somewhere-Machine.local"}
We are now set up to use Codezero's Serve with the Leaf service. A locally running service will soon be accepting requests from the cluster.
Start the Codezero Daemon
Assuming you have already installed the CLI, run:
czctl start
To verify that the daemon is running:
czctl status
You should see a message that the Codezero Daemon is running. Remember that you can run czctl help if you ever get stuck.
Serving Leaf Services
Before serving the sample-project-leaf service to the cluster, check the current behavior by navigating to the URL of your frontend in the cluster. If you are using a load balancer, you can get the frontend URL with a kubectl command:
kubectl get svc -n sample-project sample-project-frontend --output jsonpath='{.status.loadBalancer.ingress[0].ip}'
or open it directly (for zsh, make sure you remove the escape character before the open parenthesis):
open http://$(kubectl get svc -n sample-project sample-project-frontend --output jsonpath='{.status.loadBalancer.ingress[0].ip}')
You will see that the response of the Leaf service is from the cluster-based service:
{
"url": "http://sample-project-leaf:3010/api",
"who": "leaf",
"where": "sample-project-leaf-6b6f85dfb5-4z9jv",
"propagated-headers": "{\"x-c6o-intercept\":\"yes\"}"
}
You can now serve your local sample-project-leaf to the cluster using the following command:
czctl serve sample-project/sample-project-leaf 3010
Add a header to the frontend page request (using a browser extension such as ModHeader for Chrome): x-c6o-userid: YOUR_USER_ID. Replace YOUR_USER_ID with the user ID shown by czctl status, then refresh the frontend page. You will now see a response from your local system:
{
"url": "http://sample-project-leaf:3010/api",
"who": "leaf",
"where": "Local-Machine.local",
"propagated-headers": "{\"x-c6o-userid\":\"YOUR_USER_ID\"}"
}
The value of "where" above will be the name of your local machine.
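If you want to script this check rather than eyeball the browser, you can pull the where field out of a response. This is a minimal sketch, assuming python3 is available; in practice you would pipe curl -s http://localhost:3010/api into the same parser:

```shell
# Sketch: extract the "where" field from a saved Leaf response.
# The sample JSON mirrors the response shown above.
RESPONSE='{"who":"leaf","where":"Local-Machine.local"}'
WHERE=$(echo "$RESPONSE" | python3 -c 'import sys, json; print(json.load(sys.stdin)["where"])')
echo "served by: $WHERE"
```

When the header routes traffic to your machine, where is your local hostname; otherwise it is the name of the cluster pod.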
You can debug your code by attaching a debugger to port 9339, which allows you to set breakpoints in the Leaf service code in ./packages/leaf/index.js.
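For example, if you use VS Code, an attach configuration along these lines should work (a sketch; the name is arbitrary, and the port matches the 9339 above):

```json
{
  "type": "node",
  "request": "attach",
  "name": "Attach to Leaf",
  "port": 9339
}
```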
Core Tutorial
In this tutorial we will serve the "sample-project-core" service to the cluster while consuming other services from the "sample-project" namespace.
Setting up Consume
Let's start by running the backend Core service locally. In the sample project root run:
yarn start-core
Now make sure it's running correctly by opening the local URL in a browser:
open http://localhost:3001/api
Without Consume active, you will see errors like the following in the output:
{
"who":"core",
"where":"MacBook-Pro.local",
"mongo": {
"url":"mongodb://sample-project-database:27017/sample-project-database",
"error":"MongoNetworkError"},
"leaf": {
"error":"getaddrinfo ENOTFOUND sample-project-leaf"
},
"file":{
"path":"./data/message.txt",
"data":"99 bugs in the code<br />\n99 bugs in the code<br />\nclone the repo, patched it around<br />\n129 bugs in the code<br />"
}
}
Consuming With Core Services
Unlike the Leaf service, the Core service needs to talk to other services in the cloud, and as a result needs to Consume. However, when Consume is active, the consumed cloud services prevent the local system from starting services on the same ports. Since the local Core service needs to listen on port 3001, we will exclude it from Consume.
czctl consume edit
Enter the following, then save and exit the editor:
sample-project/*
!sample-project/sample-project-core
When using Consume with a locally served service, we need to exclude the served service.
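One way to verify the exclusion worked is to check that port 3001 is still bindable on your machine. This is not a Codezero command, just a generic port check, assuming python3 is available:

```shell
# Sketch: check whether port 3001 is free on this machine.
# If Consume is still intercepting it, the bind will fail.
STATUS=$(python3 - <<'EOF'
import socket
s = socket.socket()
try:
    s.bind(("127.0.0.1", 3001))
    print("free")
except OSError:
    print("in use")
finally:
    s.close()
EOF
)
echo "port 3001 is $STATUS"
```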
Now your local Core service can talk to the cluster Leaf service at sample-project-leaf, and you should see this output:
{
"who":"core",
"where":"MacBook-Pro.local",
"mongo": {
"url":"mongodb://sample-project-database:27017/sample-project-database",
"success":true
},
"leaf": {
"url":"http://sample-project-leaf:3010/api",
"who":"leaf",
"where":"sample-project-leaf-6b6f85dfb5-4z9jv",
"propagated-headers":"{\"x-c6o-intercept\":\"yes\"}",
"file": {
"path":"./data/message.txt",
"data":"99 bugs in the code<br />\n99 bugs in the code<br />\nclone the repo, patched it around<br />\n129 bugs in the code<br />"
}
}
}
Serving Core Services
Now, as with the Leaf service, we can serve the Core service so that the frontend service talks to our locally running Core service. Note the port mapping: the 3001:3000 argument maps the local port 3001 to the service's port 3000 in the cluster.
czctl serve sample-project/sample-project-core 3001:3000
You can now navigate to the frontend in the cloud (for zsh, make sure you remove the escape character before the open parenthesis before kubectl below):
open http://$(kubectl get svc -n sample-project sample-project-frontend --output jsonpath='{.status.loadBalancer.ingress[0].ip}')
You will see that the core output comes from your locally running service with the response:
{
"url": "/api",
"who": "core",
"where": "Local-Machine.local"
}
As before, you will still be able to set breakpoints in your locally running Core service.
Collaboration with Serve and Consume
For the following, we assume the backend developer still has the Core service from the previous section being served to the cluster.
Now, a frontend developer on another machine can consume services in the cluster and access your locally running Core service.
On another machine, consume the sample-project namespace and run the frontend locally:
czctl consume edit
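The frontend developer's consume file can simply include the whole namespace. A sketch (if a locally running service ever clashed with a consumed service's port, it could be excluded with a ! pattern, as in the Core section):

```
sample-project/*
```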
yarn start-frontend
Navigate to http://localhost:3030?t=1 with the header x-c6o-userid: YOUR_USER_ID set:
open http://localhost:3030?t=1
The frontend developer should see the output from your locally running Core service, in which you can set breakpoints:
{
"url": "/api",
"who": "core",
"where": "Local-Machine.local"
}
Cleanup
When you are done, you can stop all sessions by stopping the Codezero Daemon:
czctl stop