JSON Schema validation | Godspeed Docs
The framework provides request and response schema validation.
Request schema validation
We can define inputs and their types in the request schema: path parameters, query parameters, and the request body. This lets the framework verify that an API call received the specified inputs in the expected types. Whenever an API is triggered, AJV (Another JSON Schema Validator) checks the provided inputs against the request schema. If the inputs match the schema, the workflow is allowed to execute; otherwise the framework throws an error with status code 400 and a descriptive message indicating where schema validation failed.
Response schema validation
Just like request schema validation, response schema validation is also in place. Here the framework checks the response type, validates the properties of the response, and ensures they align with the specified types. The process involves storing the response schema, letting the workflow execute, and then checking the response body and its properties against that schema. Response schema validation covers two cases:
- failure during workflow execution
- successful workflow execution that then fails response schema validation
In the case of failed request schema validation, the API responds with status 400 and a "Bad Request" message. Conversely, if response schema validation fails, the API returns status 500 with an "Internal Server Error" message.
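To make the two failure modes concrete: given the event spec below, whose request and response schemas both require a name string, a payload shaped like the following would fail validation (the payload is illustrative, not from the docs):

{ "username": "Ada" }
# missing the required "name" property
# as a request body  -> rejected with 400 before the workflow runs
# as a response body -> the workflow succeeded, but the API returns 500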
Event with response and request schema validation

http.post./helloworld:
  fn: helloworld
  params:
    - name: path_params
      in: path
      required: true
      schema:
        type: string
    - name: query_params
      in: query
      required: true
      schema:
        type: string
  body:
    content:
      application/json:
        schema:
          type: object
          required: [name]
          properties:
            name:
              type: string
  responses:
    200:
      content:
        application/json:
          schema:
            type: object
            required: [name]
            properties:
              name:
                type: string
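For completeness, a minimal sketch of what the referenced helloworld function could look like as a YAML workflow, assuming the built-in com.gs.return task function from the Godspeed docs; the file location (src/functions/helloworld.yaml) and the templating details are assumptions:

summary: echo the caller's name back
tasks:
  - id: respond
    fn: com.gs.return        # built-in that returns its args as the response
    args: <% inputs.body %>  # { name: ... }, which satisfies the 200 response schema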
·godspeed.systems·
Design Principles | Godspeed Docs
Three fundamental abstractions
Schema driven data validation
We follow the Swagger spec as the standard for validating event schemas, whether for incoming or outgoing (HTTP) events, without the developer having to write any code. For database API calls, the datastore plugin validates the arguments against the DB model specified in Prisma format. The plugins for HTTP APIs and datastores thus validate third-party API requests and responses, datastore queries, and incoming events based on the Swagger spec or DB schema. For more intricate validation scenarios, such as conditional validation based on attributes like subject, object, environment, or payload, developers can implement these rules in the application logic as middleware or workflows.
Unified datastore model and API
The unified model configuration and CRUD API cover popular SQL and NoSQL stores, including Elasticgraph (a unique ORM over Elasticsearch), and offer standardized interfaces to the various types of datastores. Each integration adapts to the nature of the data store. The Prisma and Elasticgraph plugins provided by Godspeed also expose the native functions of the underlying client, giving developers the freedom to use either the universal syntax or native queries.
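As an illustration of the universal syntax, a workflow task invoking a Prisma-backed datasource might look like the sketch below; the datasource name (mysql), model (User), and method (findUnique) follow the Prisma client convention but are assumptions here, not values from the docs:

tasks:
  - id: fetch_user
    fn: datasource.mysql.User.findUnique   # datasource.<name>.<model>.<method>
    args:
      where:
        id: <% inputs.params.id %>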
Authentication
Authentication identifies who the user is and generates their access tokens or JWT for authorized access to the application's resources. The framework gives developers full freedom to set up any kind of authentication. For example, they can set up simple auth using the microservice's internal datastore, or invoke an IAM service like ORY Kratos, Okta, or an in-house service. They can also add OAuth2 authentication with providers like Google, Microsoft, Apple, and GitHub using pre-built plugins, or import and customize an existing HTTP plugin like Express by adding PassportJS middleware.
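For instance, JWT-based authentication can be switched on in the HTTP event source's configuration; a minimal sketch, with key names based on the Godspeed docs and the secret values assumed to come from config:

type: express
port: 3000
authn:
  jwt:
    secretOrKey: <% config.jwtSecret %>
    audience: <% config.jwtAudience %>
    issuer: <% config.jwtIssuer %>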
Authorization
Authorization is key to security, for multi-tenant and a variety of other use cases. The framework allows a neat, clean, low-code syntax for fine-grained authorization at the event level or at a workflow's task level, when querying a database or another API. Developers define authorization rules for each event or workflow task using straightforward configurations for JWT validation or RBAC/ABAC. For more complex use cases, for example querying a policy engine and dynamically computing permissions, they can write workflows or native functions that access the datasources, compute the rules on the fly, and patch the outcome into a task's authz parameter. These rules cover not only access to API endpoints but also fine-grained data access within datastores, at the table, row, and column level. The framework allows seamless integration with third-party authorization services or ACL databases via the datasource abstraction.
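A hedged sketch of event-level authorization, where an authz task evaluates the caller's JWT claims; the overall shape follows the Godspeed docs' authz convention, but the claim names and the inline expression are illustrative:

http.get./admin/users:
  fn: list_users
  authz:
    - id: require_admin
      fn: com.gs.transform
      args: <% inputs.user.role == 'admin' %>   # request is rejected unless this evaluates truthy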
Autogenerated Swagger spec
Following the principles of schema-driven development, the event spec of the microservice can be used to auto-generate the Swagger spec for the HTTP APIs exposed by this service. The framework provides this autogenerated Swagger documentation via its CLI.
Autogenerated CRUD API
The framework provides autogenerated CRUD APIs from a database model written in Prisma format. Generated APIs can be extended by developers as per their needs. We are planning to auto-generate GraphQL and gRPC APIs as well, and may release a developer bounty for the same soon.
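Generation is invoked from the project root roughly as below; the command name comes from the Godspeed CLI docs, and it assumes a Prisma schema already exists under the project's datasources:

godspeed gen-crud-api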
Environment variables and configurations
The framework promotes setting up environment variables in a pre-defined YAML file, though the developer can also provide them by other means, via a .env file or by setting them manually. Further configurations are written in the /config folder. These variables are accessible in:
- other configuration files
- datasource, event source, and event definitions
- workflows
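A minimal sketch of the mapping file, assuming the standard node-config convention (config/custom-environment-variables.yaml) that Godspeed projects follow; the variable names are illustrative:

jwtSecret: JWT_SECRET   # process.env.JWT_SECRET becomes config.jwtSecret
port: PORT              # process.env.PORT becomes config.port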
Log redaction
The framework allows the developer to specify keys that may hold sensitive information and should never be published in logs by mistake. There is a centralized check for such keys before any log is printed.
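A sketch of what such a redaction list could look like in the project's config; the exact key name (redact) is an assumption here, so check the Godspeed docs for your version:

redact:
  - password
  - authorization
  - user.email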
Telemetry autoinstrumentation using OTEL
Godspeed allows a developer to add auto-instrumentation which publishes logs, traces, and APM information in the OTEL standard format, supported by all major observability backends. The APM export captures not just RAM and CPU information per node/pod/service, but also the latency of incoming API calls, with broken-down spans giving the breakup of latency across calls to datastores or external APIs. This helps locate exact bottlenecks. Further, logs and traces/spans are correlated to find exactly where an error happened in a request spanning multiple microservices, each calling multiple datasources and doing internal computation. Developers can also add custom logging, span creation, and BPM metrics at the task level, for example new user registrations or failed login attempts.
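Wiring the exporters typically comes down to standard OTEL environment variables; OTEL_SERVICE_NAME and OTEL_EXPORTER_OTLP_ENDPOINT are from the OTEL spec, while the OTEL_ENABLED toggle is assumed from the Godspeed docs:

export OTEL_ENABLED=true
export OTEL_SERVICE_NAME=my-godspeed-service
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317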
·godspeed.systems·
Reverse Engineer an API using MITMWEB and POSTMAN and create a Swagger file (crAPI)
Many times, when we are trying to pentest an API, we might not get access to its Swagger file or documentation. Today we will try to create the Swagger file using mitmweb and Postman.
Man in the Middle Proxy (mitmweb)
We run mitmweb from the command line in Kali.
As we can see, it starts listening on port 8080 for HTTP/HTTPS traffic, and we make sure it's running by navigating to the web interface, which is served on localhost at port 8081.
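The command for this step, with mitmweb's default ports spelled out explicitly for clarity:

mitmweb --listen-port 8080 --web-port 8081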
Then we proxy our traffic through proxy port 8080, the port normally used by Burp Suite, because mitmweb is already listening on it (make sure Burp is closed so the two don't conflict).
Then we stop the capture, save the flows, and use mitmproxy2swagger to analyse them.
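A typical invocation, per the mitmproxy2swagger README; the flow file name, output name, and API prefix below are examples, not values from this walkthrough:

mitmproxy2swagger -i flows.mitm -o spec.yml -p http://crapi.local -f flow
# edit spec.yml to un-ignore the endpoints you care about, then re-run to fill in examples:
mitmproxy2swagger -i flows.mitm -o spec.yml -p http://crapi.local -f flow --examples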
·medium.com·