
Generative AI for Executives


  • AWS Advanced Training Partner

  • AWS Premium Consulting Partner


Current course dates can be found at the bottom of this page. Company training is available on request!

Course description

In this course, you will learn how to leverage generative artificial intelligence (AI) within your organization. We’ll cover how to drive business value with generative AI, the use cases across various industries, and the considerations to implement generative AI safely and responsibly. The goal of this course is to provide you with the fundamental concepts and tools you’ll need to successfully lead generative AI initiatives within your organization.

Course objectives

In this course, you will learn to:

  • Recognize the potential business value of generative AI

  • Identify real-world use cases that you can implement today

  • Manage the people, process, and technology changes needed to be successful

  • Use generative AI safely and responsibly

  • Identify the specific steps you can take to get started with generative AI

Intended audience

This course is intended for:

  • Experienced executives and senior business leaders

Prerequisites

  • None

Activities

This course includes:

  • Presentations
  • Group discussions

Course duration / Price

  • 0.5 days
  • € 395.00 (excl. tax) per person

Course outline

Module 1: Introduction to Generative AI

  • Definitions and terminology
  • AWS approach to generative AI

Module 2: Generative AI Use Cases

  • Common use cases
  • Real-world case studies

Module 3: Overcoming Technical and Organizational Challenges

  • Security
  • Accuracy
  • Cost
  • People and culture

Module 4: Implementation

  • Identifying your use case
  • Assessing data, technology, people, and processes
  • Evaluating business impact and scaling

Module 5: Next Steps in Your Generative AI Journey

  • Next steps and additional resources
  • Course summary

IMPORTANT: Please bring your notebook (Windows, Linux or Mac) to our trainings. If this is not possible, please contact us in advance.

Course materials are in English; German materials are available on request (if available).
The course language is German; English is available on request.



New dates are being planned!


Designing and Implementing Storage on AWS



Current course dates can be found at the bottom of this page. Company training is available on request!

Course description

AWS offers a broad portfolio of storage services and solutions with diverse capabilities for storing, accessing, and protecting your data. In this course, you will learn where, how, and when to take advantage of these different service offerings: which services to consider when solving your data storage challenges, and how to evaluate your options and select the appropriate AWS storage service for your use case and business requirements. You will also gain a better understanding of how to store, manage, and protect your data in the cloud. Through a series of hands-on exercises that demonstrate the ease and power of the AWS platform, you will learn how to provision powerful storage solutions in minutes.

Course objectives

In this course, you will learn to:

  • Describe the benefits of the core AWS storage services and identify some of their primary
    use cases

  • Select and design an appropriate storage solution according to application and business
    requirements

  • Configure storage resources to work with the broad array of AWS service offerings

  • Select the right method to move data between on-premises workloads and the AWS Cloud

  • Design storage solutions to protect data at rest and in transit

  • Set up monitoring and observability for cloud storage to gain insight into access patterns,
    utilization, and efficiency

  • Design and optimize storage solutions according to cost, scalability, and performance
    requirements

Intended audience

This course is intended for:

  • Solutions Architects 

  • Cloud Storage Engineers 

  • Cloud Operations Specialists 

  • DevOps Engineers 

Prerequisites

We recommend that attendees of this course have:

Activities

This course includes:

  • Presentations
  • Hands-on labs
  • Demonstrations
  • Group exercises

Course duration / Price

  • 3 days
  • € 1,995.00 (excl. tax) per person

Course outline

  • Day 1

    • Module 1: Introduction to Cloud Storage
      • Storage in the AWS cloud

      • Designing Well-Architected Storage Solutions

      • Designing Durable and Available Storage Solutions

      • Building Accessible and Secure Storage Solutions

    • Module 2: Designing Object Storage Solutions in AWS
      • What is object storage?

      • Planning and designing your Amazon S3 deployment

      • Managing Amazon S3

      • Access Control with Amazon S3

      • Hands-On Lab: Exploring S3 Access Control and S3 Object Lambda

    • Module 3: Implementing Object Storage Solutions with S3
      • Cost management and the data lifecycle

      • Managing data transfers into Amazon S3

      • Data protection in Amazon S3

      • Manage objects stored in Amazon S3 at scale

      • Hands-on Lab: Multi-Part Uploads, Batch Operations, and Cross-Region Replication with
        Amazon S3

    • Module 4: Designing Block Storage Solutions in AWS
      • Block storage fundamentals
      • Amazon Elastic Block Store (Amazon EBS)
      • Configuring EBS volume types
      • EC2 and EBS encryption
  • Day 2

    • Module 5: Implementing Block Storage Solutions with Amazon EBS
      • Creating EBS volumes

      • Managing EBS volumes

      • Managing EBS snapshots at scale

      • Hands-On Lab: Managing EBS Volumes: Capacity, Performance, and Data Protection

    • Module 6: File Storage and Amazon EFS
      • Cloud-based file storage

      • Amazon EFS overview

      • Accessing Amazon EFS

      • Securing and protecting Amazon EFS file systems

      • Hands-On Lab: Using Amazon EFS with AWS Lambda and Amazon ECS

    • Module 7: Cloud File Storage with Amazon FSx
      • Amazon FSx overview

      • Amazon FSx for Windows File Server

      • Amazon FSx for NetApp ONTAP

      • Amazon FSx for OpenZFS

      • Amazon FSx for Lustre

      • Choosing an Amazon FSx service

      • Hands-On Lab: Working with FSx for NetApp ONTAP and FSx for OpenZFS

  • Day 3

    • Module 8: Hybrid and Edge Cloud Storage
      • Hybrid and edge cloud storage overview

      • Introduction to AWS Storage Gateway

      • AWS Storage Gateway architectures

      • AWS Snow Family

    • Module 9: Moving Data to AWS

      • Moving data to AWS

      • Working with AWS DataSync

      • Implementing AWS Transfer Family

      • Hands-On Lab: Moving Data with Storage Gateway and DataSync

    • Module 10: Backup and Disaster Recovery with AWS

      • Designing a data protection strategy

      • AWS Backup

      • Creating backup plans

      • Working with AWS DRS

      • Hands-On Lab: Creating and Restoring Backups with AWS Backup

    • Module 11: Monitoring, Automating, and Optimizing Your AWS Storage

      • AWS Observability Services

      • Amazon S3 Storage Lens

      • Amazon CloudWatch

      • AWS CloudTrail

      • AWS Config

      • AWS Compute Optimizer

      • Hands-On Lab: Storage Monitoring, Automation, and Optimization
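Several of the hands-on topics above, such as the multipart upload lab in Module 3, follow directly from Amazon S3's published limits. As a rough, framework-free illustration (assuming the documented 5 MiB minimum part size and 10,000-part maximum per upload), a client might choose a part size like this:

```python
MIB = 1024 * 1024
MIN_PART_SIZE = 5 * MIB   # S3 minimum size for all parts except the last
MAX_PARTS = 10_000        # S3 maximum number of parts per multipart upload


def choose_part_size(object_size: int) -> int:
    """Smallest part size (bytes) that keeps the upload within MAX_PARTS
    while respecting the MIN_PART_SIZE floor."""
    needed = -(-object_size // MAX_PARTS)  # ceiling division
    return max(MIN_PART_SIZE, needed)


def part_count(object_size: int, part_size: int) -> int:
    """Number of parts a multipart upload of this object would use."""
    return -(-object_size // part_size)
```

For example, a 100 GiB object needs a part size of roughly 10.2 MiB to stay within the 10,000-part limit, while anything up to about 48.8 GiB (10,000 × 5 MiB) can simply use the minimum part size.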

IMPORTANT: Please bring your notebook (Windows, Linux or Mac) to our trainings. If this is not possible, please contact us in advance.

Course materials are in English; German materials are available on request (if available).
The course language is German; English is available on request.



New dates are being planned!


AWS Certified Data Engineer – Associate

Who should take this exam?

AWS Certified Data Engineer – Associate is designed for those who have experience in data engineering and understand the effects of volume, variety, and velocity on data ingestion, transformation, modeling, security, governance, privacy, schema design, and optimal data store design. You should also have hands-on experience with AWS services.

We recommend that you have the following knowledge before taking this exam:

  • Setup and maintenance of extract, transform, and load (ETL) pipelines from ingestion to destination
  • Application of high-level but language-agnostic programming concepts as required by the pipeline
  • How to use Git commands for source control
  • How to use data lakes to store data
  • General concepts for networking, storage, and compute
  • An understanding of the AWS services for encryption, governance, protection, and logging of all data that is part of data pipelines
  • The ability to compare AWS services to understand the cost, performance, and functional differences between services
  • How to structure SQL queries and how to run SQL queries on AWS services
  • An understanding of how to analyze data, verify data quality, and ensure data consistency by using AWS services
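The first item in the list above, an ETL pipeline from ingestion to destination, is a language-agnostic concept; a toy Python sketch of the three stages (no AWS services involved, all names illustrative):

```python
def extract(lines):
    """Ingestion: parse raw CSV-like lines into records."""
    records = []
    for line in lines:
        name, amount = line.split(",")
        records.append({"name": name.strip(), "amount": float(amount)})
    return records


def transform(records):
    """Transformation: drop invalid rows, normalize names."""
    return [
        {"name": r["name"].upper(), "amount": r["amount"]}
        for r in records
        if r["amount"] >= 0  # a simple data-quality rule
    ]


def load(records, destination):
    """Load: append the cleaned records to the destination store."""
    destination.extend(records)


store = []
load(transform(extract(["alice, 10.5", "bob, -3", "carol, 7"])), store)
```

The same extract/transform/load separation carries over when the stages become, say, an ingestion service, a transformation job, and a destination data store.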

Prerequisites

The recommended experience prior to taking this exam is the equivalent of 2 to 3 years in data engineering or data architecture and at least 1 to 2 years of hands-on experience with AWS services.

Recertification

AWS Certifications are valid for three years. To maintain your AWS Certified status, we require you to periodically demonstrate your continued expertise through a process called recertification. Recertification helps strengthen the overall value of your AWS Certification and shows individuals and employers that your credential covers the latest AWS knowledge, skills, and best practices. Once you have obtained an AWS certification, you will receive a 50% discount on other AWS certification exams.

Developing Generative AI Applications on AWS



Please find our upcoming course dates at the end of this page!

Course description

In this advanced two-day course, software developers learn to build and customize AI solutions by using Amazon Bedrock programmatically. Through hands-on exercises and labs, participants will invoke foundation models through Amazon Bedrock APIs, implement Retrieval Augmented Generation (RAG) patterns with Amazon Bedrock Knowledge Bases, and develop AI agents with tool integration. The course focuses on the practical implementation of prompt engineering techniques, responsible AI practices with Amazon Bedrock Guardrails, open source framework integration, and architectural patterns for real-world business applications.
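The RAG pattern mentioned above reduces to a simple loop: retrieve relevant context, then prepend it to the prompt before invoking the model. A framework-free sketch of that loop (word-overlap scoring stands in for the vector search a knowledge base would perform; all names are illustrative and no Bedrock API is called):

```python
def score(query: str, doc: str) -> int:
    """Toy relevance metric: count words shared by query and document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))


def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k highest-scoring documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]


def build_prompt(query: str, docs: list) -> str:
    """Augment the user question with retrieved context."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"


docs = [
    "Amazon S3 stores objects in buckets.",
    "Amazon EC2 provides resizable compute capacity.",
]
prompt = build_prompt("Where does S3 store objects?", docs)
```

In a real application, the retrieval step would be handled by Amazon Bedrock Knowledge Bases and the assembled prompt sent to a foundation model; the augmentation logic itself stays this simple.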

Course objectives

In this course, you will learn to:

  • Develop generative AI applications using Amazon Bedrock.
  • Design architecture patterns of generative AI applications.
  • Configure Amazon Bedrock APIs to invoke foundation models (FMs) programmatically.
  • Develop agentic AI applications by integrating Amazon Bedrock tools and open source frameworks.
  • Build custom solutions with Retrieval Augmented Generation (RAG) and Amazon Bedrock Knowledge Bases.
  • Integrate open source SDKs with Amazon Bedrock to build business applications.
  • Optimize model responses by applying prompt engineering techniques.
  • Evaluate generative AI application components.
  • Implement responsible AI practices to protect generative AI applications.

Intended audience

This course is intended for:

  • Software developers

Prerequisites

We recommend that attendees of this course have:

Activities

This course includes:

  • Presentations
  • Demonstrations
  • Hands-on labs
  • Group exercises

Course duration / Price

  • 2 days
  • € 1,500.00 (excl. tax) per person (DE)

Course outline

  • Day 1

    • Module 1: Exploring Components of Generative AI Applications on AWS
      • Understanding generative AI concepts
      • Identifying AWS generative AI stack components
      • Designing generative AI application components
    • Module 2: Programming with Amazon Bedrock
      • Guiding model response generation
      • Using Amazon Bedrock programmatically
      • Hands-on lab: Develop with Amazon Bedrock APIs
      • Hands-on lab: Develop Streaming Patterns with Amazon Bedrock APIs
    • Module 3: Applying Prompt Engineering for Developers
      • Introducing prompt engineering
      • Introducing prompt techniques
      • Optimizing prompts for better results
    • Module 4: Using Amazon Bedrock APIs in Common Architectures
      • Implementing architecture patterns with Amazon Bedrock APIs
      • Exploring common use cases
      • Adding conversational memory to extend context
      • Hands-on lab: Develop Conversation Patterns with Amazon Bedrock APIs
    • Module 5: Customizing Generative AI Responses with RAG
      • Implementing Retrieval Augmented Generation (RAG)
      • Using Amazon Bedrock Knowledge Bases
      • Hands-on lab: Develop Retrieval Augmented Generation (RAG) Applications with Amazon Bedrock Knowledge Bases
    • Module 6: Integrating Open Source Frameworks with Amazon Bedrock
      • Invoking a foundation model in Amazon Bedrock using LangChain
      • Using LangChain for context-aware responses
      • Hands-on lab: Develop a Generative AI Application Pattern using Open Source Frameworks and Amazon Bedrock Knowledge Bases
  • Day 2

    • Module 7: Evaluating Generative AI Application Components
      • Evaluating application components
      • Evaluating model output
      • Evaluating RAG output
      • Optimizing latency and cost
      • Hands-on lab: Evaluating Retrieval Augmented Generation (RAG) Applications
    • Module 8: Implementing Responsible AI
      • Understanding responsible AI
      • Mitigating bias and addressing prompt misuses
      • Using Amazon Bedrock Guardrails
      • Hands-on lab: Securing Generative AI Applications Using Bedrock Guardrails
    • Module 9: Using Tools and Agents in Generative AI Applications
      • Using tools
      • Understanding AI agents
      • Understanding open source agentic frameworks
      • Understanding agent interoperability
    • Module 10: Developing Amazon Bedrock Agents
      • Implementing Amazon Bedrock Flows
      • Designing Amazon Bedrock Agents
      • Developing Amazon Bedrock Inline Agents
      • Designing multi-agent collaboration
      • Using Amazon Bedrock AgentCore
      • Hands-on lab: Developing Amazon Bedrock Agents Integrated with Amazon Bedrock Knowledge Bases and Guardrails
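The conversational-memory idea from Module 4 (carrying prior turns into each new model invocation) can be sketched without calling any model; this buffer and its message format are illustrative, not the Bedrock Converse API:

```python
class ConversationMemory:
    """Keeps the last max_turns user/assistant exchanges as model input."""

    def __init__(self, max_turns: int = 5):
        self.max_turns = max_turns
        self.turns = []

    def add(self, role: str, text: str) -> None:
        """Record one message and trim the history window."""
        self.turns.append({"role": role, "text": text})
        self.turns = self.turns[-2 * self.max_turns:]

    def as_messages(self, new_user_text: str) -> list:
        """History plus the new user message, ready to send to a model."""
        return self.turns + [{"role": "user", "text": new_user_text}]


memory = ConversationMemory(max_turns=2)
memory.add("user", "What is Amazon Bedrock?")
memory.add("assistant", "A managed service for foundation models.")
messages = memory.as_messages("Which models does it offer?")
```

After each model response, the new user/assistant pair is added back into the buffer, so a follow-up question like the one above arrives with the context needed to resolve "it".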

IMPORTANT: Please bring your notebook (Windows, Linux or Mac) to our trainings. If this is not possible, please contact us in advance.

Course materials are in English; German materials are available on request (if available).
The course language is German; English is available on request.


New dates are being planned!


Amazon SageMaker Studio for Data Scientists



Please find our upcoming course dates at the end of this page!

Course description

Amazon SageMaker Studio helps data scientists prepare, build, train, deploy, and monitor machine learning (ML) models quickly. It does this by bringing together a broad set of capabilities purpose-built for ML. This course prepares experienced data scientists to use the tools that are a part of SageMaker Studio, including Amazon CodeWhisperer and Amazon CodeGuru Security scan extensions, to improve productivity at every step of the ML lifecycle.

Course objectives

In this course, you will learn to:

  • Accelerate the process to prepare, build, train, deploy, and monitor ML solutions using Amazon SageMaker Studio

Intended audience

This course is intended for:

  • Experienced data scientists who are proficient in ML and deep learning fundamentals

Prerequisites

We recommend that attendees of this course have:

  • Experience using ML frameworks
  • Python programming experience
  • At least 1 year of experience as a data scientist responsible for training, tuning, and deploying models
  • AWS Technical Essentials 

Activities

This course includes:

  • Presentations
  • Demonstrations
  • Hands-on labs
  • Discussions
  • A capstone project

Course duration / Price

  • 3 days
  • € 2,095.00 (excl. tax) per person (DE)

Course outline

  • Day 1

    • Module 1: Amazon SageMaker Studio Setup
      • JupyterLab Extensions in SageMaker Studio
      • Demonstration: SageMaker user interface demo
    • Module 2: Data Processing
      • Using SageMaker Data Wrangler for data processing
      • Hands-On Lab: Analyze and prepare data using Amazon SageMaker Data Wrangler
      • Using Amazon EMR
      • Hands-On Lab: Analyze and prepare data at scale using Amazon EMR
      • Using AWS Glue interactive sessions
      • Using SageMaker Processing with custom scripts
      • Hands-On Lab: Data processing using Amazon SageMaker Processing and SageMaker Python SDK
      • SageMaker Feature Store
      • Hands-On Lab: Feature engineering using SageMaker Feature Store
    • Module 3: Model Development
      • SageMaker training jobs
      • Built-in algorithms
      • Bring your own script
      • Bring your own container
      • SageMaker Experiments
      • Hands-On Lab: Using SageMaker Experiments to Track Iterations of Training and Tuning Models
  • Day 2

    • Module 3: Model Development (continued)
      • SageMaker Debugger
      • Hands-On Lab: Analyzing, Detecting, and Setting Alerts Using SageMaker Debugger
      • Automatic model tuning
      • SageMaker Autopilot: Automated ML
      • Demonstration: SageMaker Autopilot
      • Bias detection
      • Hands-On Lab: Using SageMaker Clarify for Bias and Explainability
      • SageMaker JumpStart
    • Module 4: Deployment and Inference
      • SageMaker Model Registry
      • SageMaker Pipelines
      • Hands-On Lab: Using SageMaker Pipelines and SageMaker Model Registry with SageMaker Studio
      • SageMaker model inference options
      • Scaling
      • Testing strategies, performance, and optimization
      • Hands-On Lab: Inferencing with SageMaker Studio
    • Module 5: Monitoring
      • Amazon SageMaker Model Monitor
      • Discussion: Case study
      • Demonstration: Model Monitoring
  • Day 3

    • Module 6: Managing SageMaker Studio Resources and Updates
      • Accrued cost and shutting down
      • Updates
    • Capstone
      • Environment setup
      • Challenge 1: Analyze and prepare the dataset with SageMaker Data Wrangler
      • Challenge 2: Create feature groups in SageMaker Feature Store
      • Challenge 3: Perform and manage model training and tuning using SageMaker Experiments
      • (Optional) Challenge 4: Use SageMaker Debugger for training performance and model optimization
      • Challenge 5: Evaluate the model for bias using SageMaker Clarify
      • Challenge 6: Perform batch predictions using model endpoint
      • (Optional) Challenge 7: Automate the full model development process using SageMaker Pipelines
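The feature-group challenges above rest on one idea: a feature store keys named feature values by a record identifier so that training and inference read identical data. A toy in-memory illustration of that contract (not the SageMaker Feature Store API; all names are made up):

```python
class FeatureGroup:
    """Maps a record identifier to a dict of named feature values."""

    def __init__(self, name: str, id_key: str):
        self.name = name
        self.id_key = id_key
        self._records = {}

    def ingest(self, record: dict) -> None:
        """Store a record under its identifier, replacing older values."""
        self._records[record[self.id_key]] = dict(record)

    def get_record(self, record_id) -> dict:
        """Fetch the latest feature values for one record (online lookup)."""
        return self._records[record_id]


customers = FeatureGroup("customers", id_key="customer_id")
customers.ingest({"customer_id": 7, "avg_basket": 31.4, "visits": 12})
customers.ingest({"customer_id": 7, "avg_basket": 33.0, "visits": 13})
latest = customers.get_record(7)
```

Because training jobs and inference endpoints read through the same lookup, feature definitions cannot drift apart between the two, which is the main problem a feature store solves.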

IMPORTANT: Please bring your notebook (Windows, Linux or Mac) to our trainings. If this is not possible, please contact us in advance.

Course materials are in English; German materials are available on request (if available).
The course language is German; English is available on request.



New dates are being planned!
