SERVICES

 

Cloud Native

Grand Cloud believes in a Cloud Native approach, developing applications with a Cloud-First mentality. A Cloud Native Architecture is one that scales globally, maintains strong consistency, and communicates over a Service Mesh. Our Cloud Native expertise covers a variety of tools such as Kubernetes, Istio, Vault, Consul, and Fauna.

Data Pipelines

Data pipelines turn all of the data flowing through an enterprise into real-time, mission-critical decisions that drive business value. We have experience building these types of pipelines with Kafka, Spark, Flink, Cassandra, and other tools.
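
As a small illustration of the kind of pipeline work we do, here is a minimal sketch that publishes an event to Kafka with the standard Java producer client; the broker address, topic name, and payload are placeholders.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventPublisher {
      public static void main(String[] args) {
        // Placeholder broker address; point this at your Kafka cluster.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Publish one order event to a placeholder "orders" topic; downstream
        // processors (Spark, Flink, ...) consume from that topic.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
          producer.send(new ProducerRecord<>("orders", "order-42", "{\"total\": 99.95}"));
          producer.flush();
        }
      }
    }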

DevOps

DevOps combines the roles of software development and operations. Grand Cloud’s DevOps engineers have deep expertise in tools such as Sentinel, Terraform, and Ansible for building infrastructure on AWS and Google Cloud.

Enterprise & Web Development

Grand Cloud teams have a proven track record of building software solutions across dozens of business domains. Our engineers can handle the full life cycle of development or augment your existing teams. We have broad platform experience in technologies like Java, Scala, Akka, Python, Node.js, JavaScript, Angular, React/Redux, and many more.

THE TEAM

Ryan Knight

Chief Executive Officer

Ryan is a technical thought leader with ...

Aleza Leinwand 

Senior UX Designer

Aleza is a user experience designer who ...

Henrik Engström

Chief Technical Officer

Henrik has over 20 years of professional...

Benjamin Edwards

Senior Consultant

Benjamin's background and interest is in...

 

TRAININGS

We offer the following trainings:

Building Cloud Native Applications with gRPC, Kubernetes and Istio

2 DAY COURSE

In this advanced two-day hands-on class, your team will learn how to take a Cloud Native application from inception to production. Starting with a base sample application, we will break it into separate services that communicate via gRPC. We will then take that application to production using Kubernetes. Next we will look at the challenges of reliable service communication in a complex topology of services. The last part of the workshop layers in Istio to create a service mesh for advanced security, traffic management, and telemetry.

Course offered in both Java and Scala.

Some of the topics to be covered include:

  • Service communication using Protobuf 3 and gRPC

  • Deploying and Managing Services using Kubernetes

  • Building a Service Mesh with Istio

  • Releasing new services with Canary deployments

  • Using Istio to create reliable service-to-service communication

  • Advanced uses of Istio for traffic management and secure communication

  • In-depth observability using telemetry and distributed tracing
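
For a flavor of the workshop's starting point, here is a minimal sketch of a gRPC service in Java; it assumes Greeter stubs (GreeterGrpc, HelloRequest, HelloReply) generated by protoc from a simple hello-world proto and the grpc-java libraries on the classpath.

    import io.grpc.Server;
    import io.grpc.ServerBuilder;
    import io.grpc.stub.StreamObserver;

    public class GreeterServer {

      // Assumes GreeterGrpc, HelloRequest and HelloReply were generated by protoc
      // from a simple Greeter proto; the names are illustrative only.
      static class GreeterImpl extends GreeterGrpc.GreeterImplBase {
        @Override
        public void sayHello(HelloRequest request, StreamObserver<HelloReply> responseObserver) {
          HelloReply reply = HelloReply.newBuilder()
              .setMessage("Hello " + request.getName())
              .build();
          responseObserver.onNext(reply);
          responseObserver.onCompleted();
        }
      }

      public static void main(String[] args) throws Exception {
        // A plain gRPC server; in the workshop this gets containerized, deployed
        // to Kubernetes, and fronted by Istio for mTLS, routing, and telemetry.
        Server server = ServerBuilder.forPort(50051)
            .addService(new GreeterImpl())
            .build()
            .start();
        server.awaitTermination();
      }
    }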

FaunaDB Developer Training

1 DAY COURSE

This course is designed for a development team just getting started with FaunaDB. The hands-on exercises will help the team gain an in-depth understanding of FaunaDB's functional query paradigm and how it provides the equivalent of traditional ACID transactions.

Course offered in both Java and Scala.

Some of the topics covered include:

  • Fundamentals of the Distributed Transaction Engine

  • Achieving Consistency at a Global Scale

  • Schema Design

  • Creating Databases, Classes, Indexes and Instances

  • Security, Identity and Isolation

  • Scheduling with the Quality of Service Manager
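
As a taste of the exercises, the sketch below uses the FaunaDB Java driver to create a collection (the newer name for a class), a unique index, and a document, then reads it back; the secret, collection, and field names are placeholders.

    import com.faunadb.client.FaunaClient;
    import com.faunadb.client.types.Value;

    import static com.faunadb.client.query.Language.*;

    public class FaunaQuickstart {
      public static void main(String[] args) throws Exception {
        // Placeholder secret; use a server key for your own database.
        FaunaClient client = FaunaClient.builder()
            .withSecret(System.getenv("FAUNA_SECRET"))
            .build();

        // Create a collection, a unique index on email, and a single document.
        client.query(CreateCollection(Obj("name", Value("customers")))).get();
        client.query(CreateIndex(Obj(
            "name", Value("customers_by_email"),
            "source", Collection("customers"),
            "terms", Arr(Obj("field", Arr(Value("data"), Value("email")))),
            "unique", Value(true)))).get();
        client.query(Create(Collection("customers"),
            Obj("data", Obj("name", Value("Ada"), "email", Value("ada@example.com"))))).get();

        // Look the document up by email through the index.
        Value customer = client.query(
            Get(Match(Index("customers_by_email"), Value("ada@example.com")))).get();
        System.out.println(customer);
      }
    }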

Reactive Architecture with FaunaDB

1 DAY COURSE

In this advanced one-day class we will look at combining the core concepts of FaunaDB, such as functional queries and ACID transactions, with core reactive principles to create a Cloud Native application. We will use these principles to enforce complex business constraints that span multiple services. In addition, we will look at how to architect an application that combines global scale with strong consistency.

Course offered in both Java and Scala.
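
The sketch below, in Java, illustrates the kind of constraint covered in class: a single FaunaDB transaction that moves funds between two hypothetical account documents only when the balance covers the amount, and aborts otherwise; all names and values are placeholders.

    import com.faunadb.client.FaunaClient;
    import com.faunadb.client.query.Expr;
    import com.faunadb.client.types.Value;

    import static com.faunadb.client.query.Language.*;

    public class TransferSketch {
      public static void main(String[] args) throws Exception {
        FaunaClient client = FaunaClient.builder()
            .withSecret(System.getenv("FAUNA_SECRET"))
            .build();

        // Placeholder documents in a hypothetical "accounts" collection.
        Expr from = Ref(Collection("accounts"), Value("101"));
        Expr to = Ref(Collection("accounts"), Value("102"));
        long amount = 25;

        // One ACID transaction: the debit and credit happen together, and the
        // whole query aborts if the source balance cannot cover the amount.
        Value result = client.query(
            Let("fromBalance", Select(Arr(Value("data"), Value("balance")), Get(from))).in(
                If(GTE(Var("fromBalance"), Value(amount)),
                   Do(Update(from, Obj("data",
                          Obj("balance", Subtract(Var("fromBalance"), Value(amount))))),
                      Update(to, Obj("data",
                          Obj("balance", Add(Select(Arr(Value("data"), Value("balance")), Get(to)),
                                             Value(amount)))))),
                   Abort(Value("insufficient funds"))))).get();

        System.out.println(result);
      }
    }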

Large Scale Data Pipelines with Spark, Kafka and Cassandra

1 DAY COURSE

In this one-day class we will walk through building a large-scale, mission-critical data pipeline using Kafka, Spark Streaming, and Cassandra. The workshop will start by looking at the individual technologies that make up the data pipeline, then discuss the overall architecture and the core principles it must address. Attendees will get hands-on exercises that walk through each piece of the architecture.
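
As a preview of the exercises, the sketch below wires the pieces together in Java: Spark Structured Streaming reads from a Kafka topic and writes each micro-batch to Cassandra through the Spark Cassandra connector; the broker, topic, keyspace, and table names are placeholders, and the spark-sql-kafka and connector packages are assumed to be on the classpath.

    import org.apache.spark.api.java.function.VoidFunction2;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;

    public class PipelineSketch {
      public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
            .appName("pipeline-sketch")
            .master("local[*]")  // local master for the exercise; omit when submitting to a cluster
            .getOrCreate();

        // Read the raw event stream from a placeholder Kafka topic.
        Dataset<Row> events = spark.readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "localhost:9092")
            .option("subscribe", "events")
            .load()
            .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value");

        // Write each micro-batch to a placeholder Cassandra table through the
        // Spark Cassandra connector.
        VoidFunction2<Dataset<Row>, Long> writeToCassandra = (batch, batchId) ->
            batch.write()
                .format("org.apache.spark.sql.cassandra")
                .option("keyspace", "pipeline")
                .option("table", "events")
                .mode("append")
                .save();

        StreamingQuery query = events.writeStream()
            .foreachBatch(writeToCassandra)
            .start();

        query.awaitTermination();
      }
    }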

 

CONTACT

For all inquiries, please contact us at info@grandcloud.com

© 2020 Grand Cloud, LLC