Distributed Data Management Architecture
Distributed Data Management Architecture (DDM) is IBM's open, published software architecture for creating, managing, and accessing data on a remote computer. DDM was initially designed to support record-oriented files; it was extended to support hierarchical directories, stream-oriented files, queues, and system command processing; it was further extended to serve as the base of IBM's Distributed Relational Database Architecture (DRDA); and finally, it was extended to support data description and conversion. Defined in the period from 1980 to 1993, DDM specifies the necessary components, messages, and protocols, all based on the principles of object-orientation. DDM is not itself a piece of software; implementations of DDM take the form of client and server products. Because DDM is an open architecture, products can implement subsets of it and can extend it to meet additional requirements. Taken together, DDM products implement a distributed file system.
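To make the client/server pattern described above concrete, the following is a minimal sketch in Python of a DDM-style exchange: a client product builds a command object and a server product executes it against a record-oriented file. All names here (Command, Reply, RecordFileServer, CREATE, PUT_RECORD, GET_RECORD) are hypothetical placeholders for illustration, not actual DDM codepoints or IBM APIs, and a direct method call stands in for the network transport that a real DDM data stream would use.

from dataclasses import dataclass

@dataclass
class Command:
    """A request message: a named command plus its parameters."""
    name: str
    params: dict

@dataclass
class Reply:
    """A reply message: a success flag plus any returned data."""
    ok: bool
    data: object = None

class RecordFileServer:
    """Server side: owns record-oriented files and executes commands."""

    def __init__(self):
        self.files = {}  # file name -> list of records

    def execute(self, cmd: Command) -> Reply:
        # Dispatch each command object to its handler, mirroring DDM's
        # object-oriented message processing.
        handler = getattr(self, "do_" + cmd.name.lower(), None)
        if handler is None:
            return Reply(ok=False, data=f"unsupported command: {cmd.name}")
        return handler(**cmd.params)

    def do_create(self, file: str) -> Reply:
        self.files[file] = []
        return Reply(ok=True)

    def do_put_record(self, file: str, record: bytes) -> Reply:
        self.files[file].append(record)
        return Reply(ok=True)

    def do_get_record(self, file: str, number: int) -> Reply:
        records = self.files.get(file, [])
        if 0 <= number < len(records):
            return Reply(ok=True, data=records[number])
        return Reply(ok=False, data="record not found")

# Client side: the application issues ordinary file requests, which are
# translated into command messages and sent to the remote server.
server = RecordFileServer()
server.execute(Command("CREATE", {"file": "CUSTOMER"}))
server.execute(Command("PUT_RECORD", {"file": "CUSTOMER", "record": b"Acme Corp"}))
reply = server.execute(Command("GET_RECORD", {"file": "CUSTOMER", "number": 0}))
print(reply.data)  # b'Acme Corp'

Because products need only understand the commands they implement, a server can support a subset of the architecture, and the same message-dispatch pattern extends naturally to the directories, stream files, queues, and relational database requests that later versions of DDM added.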