Internal Developer Portals: self-service actions using infrastructure as code and GitOps
Internal developer portal


Mor Paz
May 14, 2023

GitOps and the developer experience 

Why is the intersection of GitOps, infrastructure as code and the developer experience difficult? 

The truth is that GitOps can be pretty daunting for developers. From the developer's point of view, it requires deep familiarity with DevOps practices. From the DevOps point of view, it requires a high degree of trust, because whatever is committed to the git repository and lands on the main branch becomes the actual architecture.

As a result, every change needs to come through a pull request, that pull request needs to be reviewed, and all the right people need the right permissions to open those pull requests and perform those reviews. There is no way around it, because anything that reaches the main branch can have undesirable effects. GitOps solves part of the problem by giving us a single source of truth for our code and configuration, but that doesn't mean we can skip handling it properly and carefully.

This is also why developers are less comfortable dealing with GitOps. Internal developer portals and the platform engineering movement solve this by offering developer self-service for these types of actions, with the right guardrails, golden paths, manual reviews and RBAC in place. This lowers the bar for developers and makes GitOps less intimidating. That doesn't mean GitOps and internal developer portals are opposed ideas; they go hand in hand. The existence of an internal developer portal doesn't prevent developers from using GitOps: when developers perform self-service actions through the portal they can be self-sufficient, while the benefits of using GitOps under the hood remain.

Infrastructure as Code and the developer experience

Infrastructure as Code is another area where the developer experience is far from optimal. Think about Terraform or Pulumi, which require learning a new syntax, becoming familiar with a new library, and so on. The real issue is that, as we discussed above with GitOps, every line you write in Infrastructure as Code makes an actual change in the underlying infrastructure. When developers write infrastructure as code they need to understand exactly what they are doing, which requires a deep dive into the infrastructure layer.

Developers don't like doing this, and it pulls them out of the flow they were in. It's also a practice that can easily introduce security vulnerabilities or bad practices. It would be better if developers had a golden path to follow instead of worrying about a wrong resource, a wrong variable or an omission in the file. This is what internal developer portals do.


How self-service actions in the internal developer portal deal with GitOps and IaC

Let's see how we can use developer self-service actions to let developers gain independence while ensuring infrastructure changes are made in a secure, standardized and compliant way.

The first step is to identify the self-service actions you want to allow. Since you are already using infrastructure as code, you probably have an existing Terraform module or Terraform resource definition, written in a way that is dynamic and configurable. What usually happens is that when a developer comes to DevOps to request anything from a new documentDB to a new S3 bucket, you apply that Terraform module with the right variables.

How can we let developers do this without involving the DevOps team? The answer is to expose it as a self-service action: make the resource definition more parameterized and give the developer a way to specify the parameters they need. These parameters are then injected into the file, and we remain compliant and secure because the logic ensures the parameters are injected in the correct manner.
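To make this concrete, here is a minimal sketch of the injection step in Python. The template, resource names and variable names are illustrative stand-ins for whatever Terraform module your DevOps team already maintains, not actual Port or AWS artifacts:

```python
# Sketch: render a pre-approved Terraform template with developer-supplied
# parameters. All names here are hypothetical -- adapt to your own module.
from string import Template

# A simplified stand-in for a Terraform resource definition kept by DevOps.
# "$$" escapes a literal "$" so ${count.index} survives substitution.
DOCUMENTDB_TEMPLATE = Template("""
resource "aws_docdb_cluster" "$name" {
  cluster_identifier = "$name"
  engine             = "docdb"
}

resource "aws_docdb_cluster_instance" "$name-instances" {
  count              = $instance_count
  identifier         = "$name-$${count.index}"
  cluster_identifier = aws_docdb_cluster.$name.id
  instance_class     = "$instance_class"
}
""")

def render_terraform(params: dict) -> str:
    """Inject validated developer inputs into the Terraform template."""
    return DOCUMENTDB_TEMPLATE.substitute(
        name=params["name"],
        instance_count=params["instance_count"],
        instance_class=params["instance_class"],
    )
```

Because developers only supply parameter values, and the template itself never changes, the generated file stays within the golden path the DevOps team defined.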

Here's the process, step by step:

1. The developer executes a self-service action in Port’s user interface (UI/API/CLI/ChatOps). 

2. This action triggers a webhook event that is sent to a Lambda function. The Lambda function deploys the developer's values in the infrastructure and generates new infrastructure as code files, based on the template we pre-set.

        a. The Lambda function is developed by the DevOps team to handle such requests from Port's API; the developer does not need to know how to maintain or update it.

        b. In addition to triggering a Lambda function, Port supports a variety of backends, such as sending webhooks to specified URLs, triggering Jenkins or Azure pipelines, directly triggering a GitHub workflow, or sending a payload to a dedicated Kafka topic.

3. This then opens a PR in GitHub and runs Terraform plan to simulate the changes these new files will make to the infrastructure.

4. Once we merge this PR into our main branch, a CI process will run and apply those changes to our AWS infrastructure.
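The Lambda backend from steps 2 and 3 could look roughly like the sketch below. The payload field names are assumptions for illustration, not Port's documented event shape; a real handler would also render the Terraform template and open the PR via the GitHub API:

```python
# Minimal sketch of the kind of Lambda handler the DevOps team might write
# to serve the webhook from Port. The payload shape below is an assumption
# for illustration -- inspect the actual event your action sends.
import json

ALLOWED_INSTANCE_RANGE = range(1, 6)  # guardrail: one to five instances

def handler(event, context=None):
    body = json.loads(event["body"])
    inputs = body["payload"]["properties"]  # hypothetical field names

    # Enforce the guardrails before generating any IaC files.
    count = int(inputs["instance_count"])
    if count not in ALLOWED_INSTANCE_RANGE:
        return {"statusCode": 400,
                "body": json.dumps({"error": "instance_count must be 1-5"})}

    # A real handler would render the Terraform template with these inputs
    # and open a pull request; here we just report what would happen.
    return {"statusCode": 200,
            "body": json.dumps({"action": "open-pr",
                                "resource": "documentdb",
                                "instance_count": count})}
```

Keeping the guardrail check inside the handler means the limits hold even if a request bypasses the portal's form and hits the backend directly.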

At the center of any internal developer portal there is a software catalog. The software catalog acts as a single source of truth for services, CI/CD, cloud resources and anything else you've added to it. To be valuable, it needs to always be up to date. This is why, after any developer self-service action, the software catalog is updated with the new changes, in this case changes to the AWS infrastructure. This allows the software catalog to serve as a source of truth for workflow automation.
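For illustration, the catalog update for a newly provisioned documentDB might carry an entity body like the one built below. The 'Database' blueprint and its property names are hypothetical; check Port's entity API for the exact shape it expects:

```python
# Sketch: the entity body a catalog update for the new documentDB might
# carry. The blueprint name and properties are hypothetical stand-ins.
def build_entity_upsert(cluster_name: str, instance_count: int) -> dict:
    """Build the body for upserting an entity into a 'Database' blueprint."""
    return {
        "identifier": cluster_name,
        "title": cluster_name,
        "properties": {
            "type": "documentdb",
            "instance_count": instance_count,
        },
        "relations": {},
    }
```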

Every self-service action in Port has a run log. Since many self-service actions take time to run (they are long-running and asynchronous), reflecting action progress in the run log makes life easier for the developer: you expose exactly the logs they need to make sense of the action they requested, along with any pertinent information required to resolve issues.

How to enable GitOps with IaC under the hood in Port

We begin by defining the blueprints that support our use case: in this case, two blueprints, Service and Database.

A blueprint, or custom entity definition, is a data model that defines the metadata associated with software catalog entities. Blueprints are the main building block of Port.

Blueprints can represent any asset in Port, such as a microservice, environment, package, cluster or database. Since you can create any blueprint (for alerts, for CI/CD, etc.) you can support any data model (e.g. package management, alert management, FinOps). Port's internal developer portal templates are sets of pre-defined blueprints that work well with the relevant Port integration.

In this case, the Service blueprint represents microservices. The Database blueprint models actual database resources.

Self-service actions are also defined on blueprints. In this case, we'll add an 'add documentDB' action to the Service blueprint. The action is defined in a JSON document that lets you specify all the user inputs you want your developers to fill in when they trigger the action. This is also where you define guardrails. For example, in the case of a new documentDB, we can decide that the number of instances a developer can request is anything between one and five, but no more than that.
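The guardrail described above, limiting the instance count to between one and five, can be expressed as a JSON-schema-style input definition and enforced server-side. The property names below are illustrative, not Port's exact action schema:

```python
# Sketch: user inputs for a hypothetical 'add documentDB' action, expressed
# JSON-schema-style, plus a tiny validator enforcing the guardrails.
USER_INPUTS = {
    "instance_count": {"type": "number", "minimum": 1, "maximum": 5},
    "instance_class": {"type": "string",
                       "enum": ["db.t3.medium", "db.r5.large"]},
}

def validate(inputs: dict) -> list:
    """Return a list of validation errors (empty means the inputs pass)."""
    errors = []
    for name, spec in USER_INPUTS.items():
        value = inputs.get(name)
        if value is None:
            errors.append(f"{name} is required")
            continue
        if spec["type"] == "number" and not spec["minimum"] <= value <= spec["maximum"]:
            errors.append(f"{name} must be between {spec['minimum']} and {spec['maximum']}")
        if spec["type"] == "string" and value not in spec["enum"]:
            errors.append(f"{name} must be one of {spec['enum']}")
    return errors
```

The portal's form can enforce the same limits client-side, but validating again in the backend keeps the guardrail authoritative.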

Self-service actions from the point of view of the developer using the developer portal

Let's examine this from the developer's point of view. The developer goes to the service page in the software catalog and wants to add a new documentDB to the mailer service. Once the form is filled in, the action begins to execute.

To follow the entire process, I suggest viewing the accompanying video.

Two comments about the developer experience:

Run logs: in Port, run logs show the developer that the status of the action is "waiting for approval". Members of the DevOps team get the request for manual approval (since this was the rule defined for this type of request). Once approval is granted, the webhook is triggered. This is important since many actions either run for a long time or are asynchronous.

Day-2 actions: developers don't only create new infrastructure; they also need to edit it, i.e. perform day-2 operations. In Port, this is done the same way as shown above. In this case, we may want to increase the instance count to four, and the same process flows from there.



Order Domain

{
  "properties": {},
  "relations": {},
  "title": "Orders",
  "identifier": "Orders"
}

Cart System

{
  "properties": {},
  "relations": {
    "domain": "Orders"
  },
  "identifier": "Cart",
  "title": "Cart"
}

Products System

{
  "properties": {},
  "relations": {
    "domain": "Orders"
  },
  "identifier": "Products",
  "title": "Products"
}

Cart Resource

{
  "properties": {
    "type": "postgress"
  },
  "relations": {},
  "icon": "GPU",
  "title": "Cart SQL database",
  "identifier": "cart-sql-sb"
}

Cart API

{
 "identifier": "CartAPI",
 "title": "Cart API",
 "blueprint": "API",
 "properties": {
   "type": "Open API"
 },
 "relations": {
   "provider": "CartService"
 },
 "icon": "Link"
}

Core Kafka Library

{
  "properties": {
    "type": "library"
  },
  "relations": {
    "system": "Cart"
  },
  "title": "Core Kafka Library",
  "identifier": "CoreKafkaLibrary"
}

Core Payment Library

{
  "properties": {
    "type": "library"
  },
  "relations": {
    "system": "Cart"
  },
  "title": "Core Payment Library",
  "identifier": "CorePaymentLibrary"
}

Cart Service JSON

{
 "identifier": "CartService",
 "title": "Cart Service",
 "blueprint": "Component",
 "properties": {
   "type": "service"
 },
 "relations": {
   "system": "Cart",
   "resources": [
     "cart-sql-sb"
   ],
   "consumesApi": [],
   "components": [
     "CorePaymentLibrary",
     "CoreKafkaLibrary"
   ]
 },
 "icon": "Cloud"
}

Products Service JSON

{
  "identifier": "ProductsService",
  "title": "Products Service",
  "blueprint": "Component",
  "properties": {
    "type": "service"
  },
  "relations": {
    "system": "Products",
    "consumesApi": [
      "CartAPI"
    ],
    "components": []
  }
}

Component Blueprint

{
 "identifier": "Component",
 "title": "Component",
 "icon": "Cloud",
 "schema": {
   "properties": {
     "type": {
       "enum": [
         "service",
         "library"
       ],
       "icon": "Docs",
       "type": "string",
       "enumColors": {
         "service": "blue",
         "library": "green"
       }
     }
   },
   "required": []
 },
 "mirrorProperties": {},
 "formulaProperties": {},
 "calculationProperties": {},
 "relations": {
   "system": {
     "target": "System",
     "required": false,
     "many": false
   },
   "resources": {
     "target": "Resource",
     "required": false,
     "many": true
   },
   "consumesApi": {
     "target": "API",
     "required": false,
     "many": true
   },
   "components": {
     "target": "Component",
     "required": false,
     "many": true
   },
   "providesApi": {
     "target": "API",
     "required": false,
     "many": false
   }
 }
}

Resource Blueprint

{
 "identifier": "Resource",
 "title": "Resource",
 "icon": "DevopsTool",
 "schema": {
   "properties": {
     "type": {
       "enum": [
         "postgress",
         "kafka-topic",
         "rabbit-queue",
         "s3-bucket"
       ],
       "icon": "Docs",
       "type": "string"
     }
   },
   "required": []
 },
 "mirrorProperties": {},
 "formulaProperties": {},
 "calculationProperties": {},
 "relations": {}
}

API Blueprint

{
 "identifier": "API",
 "title": "API",
 "icon": "Link",
 "schema": {
   "properties": {
     "type": {
       "type": "string",
       "enum": [
         "Open API",
         "grpc"
       ]
     }
   },
   "required": []
 },
 "mirrorProperties": {},
 "formulaProperties": {},
 "calculationProperties": {},
 "relations": {
   "provider": {
     "target": "Component",
     "required": true,
     "many": false
   }
 }
}

Domain Blueprint

{
 "identifier": "Domain",
 "title": "Domain",
 "icon": "Server",
 "schema": {
   "properties": {},
   "required": []
 },
 "mirrorProperties": {},
 "formulaProperties": {},
 "calculationProperties": {},
 "relations": {}
}

System Blueprint

{
 "identifier": "System",
 "title": "System",
 "icon": "DevopsTool",
 "schema": {
   "properties": {},
   "required": []
 },
 "mirrorProperties": {},
 "formulaProperties": {},
 "calculationProperties": {},
 "relations": {
   "domain": {
     "target": "Domain",
     "required": true,
     "many": false
   }
 }
}

Microservices SDLC

  • Scaffold a new microservice

  • Deploy (canary or blue-green)

  • Feature flagging

  • Revert

  • Lock deployments

  • Add Secret

  • Force merge pull request (skip tests on crises)

  • Add environment variable to service

  • Add IaC to the service

  • Upgrade package version

Development environments

  • Spin up a developer environment for 5 days

  • ETL mock data to environment

  • Invite developer to the environment

  • Extend TTL by 3 days

Cloud resources

  • Provision a cloud resource

  • Modify a cloud resource

  • Get permissions to access cloud resource

SRE actions

  • Update pod count

  • Update auto-scaling group

  • Execute incident response runbook automation

Data Engineering

  • Add / Remove / Update Column to table

  • Run Airflow DAG

  • Duplicate table

Backoffice

  • Change customer configuration

  • Update customer software version

  • Upgrade - Downgrade plan tier

  • Create - Delete customer

Machine learning actions

  • Train model

  • Pre-process dataset

  • Deploy

  • A/B testing traffic route

  • Revert

  • Spin up remote Jupyter notebook


Engineering tools

  • Observability

  • Tasks management

  • CI/CD

  • On-Call management

  • Troubleshooting tools

  • DevSecOps

  • Runbooks

Infrastructure

  • Cloud Resources

  • K8S

  • Containers & Serverless

  • IaC

  • Databases

  • Environments

  • Regions

Software and more

  • Microservices

  • Docker Images

  • Docs

  • APIs

  • 3rd parties

  • Runbooks

  • Cron jobs
