
1. Overview

  • Objective: REAL-510

  • Scope: setGetApi, AWS API Gateway, Amazon Simple Queue Service (SQS)

2. System Components

  • Web UI: Client interface for uploading data.

  • Kairos Processing Modules: kairos-file-process, kairos-product-manager, and kairos-products, which prepare the data for storage.

  • API Gateway: Manages client requests and routes them to SQS for processing.

  • Amazon SQS: Queues data to ensure order and manage asynchronous processing.

  • setGetApi: Retrieves and processes data from SQS and performs validation.

  • Redis: Serves as both a temporary and long-term data storage layer.

  • Scheduler: Nightly task that backs up Redis data to MongoDB.

  • MongoDB: Stores a secondary copy of data for archival purposes.

3. Data Flow

  • Data moves through the system in the following sequence (the setGetApi leg is sketched after this list):

    • From Web UI upload to Kairos modules for processing.

    • API Gateway routes processed data to SQS.

    • setGetApi retrieves data from SQS, processes it, and stores it in Redis.

    • The Scheduler copies Redis data to MongoDB each night.
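The setGetApi leg of this flow (receive from SQS, validate, store in Redis) could look roughly like the sketch below. It assumes a boto3 SQS client and the redis-py client; the queue URL, the webhook-key validation rule, and the webhook:<key> naming are illustrative assumptions, not the actual implementation.

import json

import boto3
import redis

# Hypothetical values for illustration only.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/setgetapi-queue"

sqs = boto3.client("sqs")
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def poll_once():
    """Receive one batch from SQS, validate each message, and store it in Redis."""
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        payload = json.loads(msg["Body"])
        # Assumed validation: a webhook key must be present before the data is cached.
        if not payload.get("webhookKey"):
            continue
        cache.set("webhook:" + payload["webhookKey"], json.dumps(payload))
        # Delete only after the Redis write succeeds so the message is not lost.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])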

4. Data Persistence and Storage

  • Redis: Primary data store, serving both as a fast retrieval cache and as longer-term storage (subject to the configured TTLs).

  • MongoDB: Archival storage updated nightly by the Scheduler (a sketch of the backup step follows).
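A minimal sketch of the nightly Scheduler step, assuming redis-py and pymongo, that the keys to archive share a common prefix, and that the database and collection names shown here are placeholders:

import json

import redis
from pymongo import MongoClient

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
# Assumed archive location; the real database and collection names may differ.
archive = MongoClient("mongodb://localhost:27017")["kairos_archive"]["webhook_data"]

def nightly_backup(prefix="webhook:"):
    """Copy every matching Redis entry into MongoDB as the secondary, archival copy."""
    for key in cache.scan_iter(match=prefix + "*"):
        value = cache.get(key)
        if value is None:
            continue  # the key expired between the scan and the read
        archive.replace_one({"_id": key}, {"_id": key, "data": json.loads(value)}, upsert=True)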

5. Architecture

For the Set API:

(Architecture diagram for the Set API flow: Screenshot from 2024-11-15 14-57-48.png)

For the Get API:

Request from KP (webhook key) → setGetApi (retrieves data from Redis) → Response to KP.
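A minimal sketch of that Get API path, assuming the same webhook:<key> naming used on the Set side; the function name and response shape are assumptions:

import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def handle_get(webhook_key):
    """Look up the data stored for a webhook key and build the response returned to KP."""
    raw = cache.get("webhook:" + webhook_key)
    if raw is None:
        return {"status": "not_found", "webhookKey": webhook_key}
    return {"status": "ok", "data": json.loads(raw)}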

6. Sample Setup of API Gateway and SQS through Lambda in AWS for Testing

A. AWS API Gateway Setup

  1. Action:

    • Created a POST API endpoint to accept data from external sources (a sample test request is sketched after this list).

    • Configured a Mapping Template to validate and transform the input request into the required JSON format.

    • Integrated API Gateway with the SQS queue using an IAM role for secure communication.
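One way to exercise this endpoint from the client side is a plain HTTP POST; the invoke URL and payload fields below are illustrative assumptions, not the gateway's actual contract. If the mapping template and IAM role are set up correctly, the request body should appear as a message on the SQS queue.

import requests

# Hypothetical invoke URL for illustration only.
API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/test/ingest"

payload = {"webhookKey": "example-key", "data": {"productId": 42}}
resp = requests.post(API_URL, json=payload, timeout=10)
resp.raise_for_status()
print(resp.status_code, resp.text)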

B. AWS Lambda Setup

  1. Action:

    • Created a Python-based Lambda function to process messages from the SQS queue (a sketch follows after the testing steps).

    • Configured batch processing to handle multiple messages in a single invocation for efficiency.

    • Added error-handling logic to ensure failed messages are redirected to the dead-letter queue.

    • Included logic to format data and send it to Redis.

  2. Testing:

    • Triggered the Lambda function using messages from the SQS queue.

    • Verified data processing and ingestion into Redis.
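A minimal sketch of a handler matching the steps above. It assumes the SQS event source mapping has ReportBatchItemFailures enabled, so that only the failed records are retried and, after repeated failures, moved to the dead-letter queue by the redrive policy; the Redis endpoint and key naming are assumptions.

import json
import os

import redis

# Assumed Redis endpoint supplied through configuration.
cache = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"), port=6379, decode_responses=True)

def lambda_handler(event, context):
    """Process a batch of SQS records, write each to Redis, and report failures individually."""
    failures = []
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            # Assumed formatting step: store the payload under its webhook key.
            cache.set("webhook:" + payload["webhookKey"], json.dumps(payload))
        except Exception:
            # Reporting the message ID makes SQS retry only this record; repeated
            # failures are moved to the dead-letter queue by the redrive policy.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}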

C. AWS SQS Setup

  1. Action:
    Initial Testing with Standard Queue:

    • For initial functionality testing, a Standard Queue was created to validate the end-to-end flow from API Gateway to SQS and from SQS to Lambda.

    • This queue was used to ensure seamless integration and verify that no message loss occurred during processing.

    • Configured a visibility timeout so that in-flight messages are not processed more than once (a sample queue-creation sketch follows).
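A sketch of how such a test queue could be created with boto3, including the visibility timeout and a redrive policy pointing at a dead-letter queue; the queue names, timeout, and retry count are illustrative assumptions.

import json

import boto3

sqs = boto3.client("sqs")

# Dead-letter queue that receives messages the Lambda repeatedly fails to process.
dlq_url = sqs.create_queue(QueueName="setgetapi-test-dlq")["QueueUrl"]
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Standard queue used for the end-to-end API Gateway -> SQS -> Lambda test.
queue_url = sqs.create_queue(
    QueueName="setgetapi-test-queue",
    Attributes={
        # Hide in-flight messages long enough for the Lambda to finish.
        "VisibilityTimeout": "60",
        "RedrivePolicy": json.dumps({"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "3"}),
    },
)["QueueUrl"]
print(queue_url)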

7. Advantages

  • Storing Multiple Webhook Keys:
    The ability to store an arbitrary number of webhook keys enhances flexibility, allowing different systems or endpoints to receive the same data. This improves the extensibility and scalability of the integration.

  • Reduced Load on Kairos-Products and ActiveMQ:
    By handling webhook keys and managing the distribution of data within this module, the system offloads processing from the Kairos-products module and internal ActiveMQ. This reduces the risk of overloading these critical components, improving overall performance and reliability.

  • Configurable TTL (Time-To-Live):
    TTL can be configured at both the account level and the flow level, giving control over how long data is retained before it expires or is removed. This is especially useful for transient data and for tuning cache behavior per account or per flow, keeping memory and resource usage efficient (a minimal sketch follows).
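A minimal illustration of how per-account and per-flow TTLs could translate into Redis expirations, assuming the flow-level value takes precedence over the account-level one; the lookup tables and key naming are assumptions, not the real configuration model.

import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Assumed configuration: TTLs in seconds, with flow-level values overriding account-level ones.
ACCOUNT_TTL = {"acct-1": 86400}          # 24-hour default for the account
FLOW_TTL = {("acct-1", "flow-A"): 3600}  # 1-hour override for one flow

def store_with_ttl(account_id, flow_id, webhook_key, payload):
    """Store data under its webhook key with the most specific TTL that applies."""
    ttl = FLOW_TTL.get((account_id, flow_id), ACCOUNT_TTL.get(account_id))
    cache.set("webhook:" + webhook_key, json.dumps(payload), ex=ttl)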
