IOTA Distributed Ledger: Beyond Blockchain for Supply Chains

The IOTA Foundation, the organization behind IOTA’s open-source distributed ledger technology built for the Internet of Things, envisions a future where every single item in the global supply chain is tracked and traced using distributed ledgers. This vision is already becoming a reality in East Africa, thanks to the collaboration of the IOTA Foundation and TradeMark East Africa (TMEA). These organizations have come together to meet the challenge of digitizing the export process for Kenya’s flower exporters, airlines and freight forwarders.

TMEA found that for just a single transaction, an African entrepreneur completed an average of 200 communications, including 96 paper documents. The system developed by the IOTA Foundation and TMEA anchors the most important trade documents on the Tangle, a new type of distributed ledger technology that is different from the traditional blockchain model, and shares them with destination country customs. This speeds up the export process and makes African companies more competitive globally.

What is behind this initiative from a technological perspective? That’s what José Manuel Cantera, technical analyst and project manager at the IOTA Foundation, recently shared. From a bird’s eye view, this means using:

  • EPCIS 2.0 data serialization formats for data interoperability
  • IOTA distributed ledgers to record all events that occur in supply chains
  • ScyllaDB NoSQL for scalable, resilient persistent storage

Let’s dive into the details with a closer look at two specific use cases: cross-border trade and end-to-end traceability. But first, Cantera’s perspective on the technical challenges associated with digitizing the supply chain.

Cantera created this talk for the ScyllaDB Summit, a virtual conference to explore what it takes to power instant experiences with massive distributed data sets. Register now (free + virtual) to join us live for ScyllaDB Summit 2023 with experts from Discord, Hulu, Strava, Epic Games, ScyllaDB and more, plus industry leaders on the latest in WebAssembly, Rust, NoSQL, SQL and event streaming trends.

Supply Chain Digitalization: Top Technical Challenges

Cantera began by introducing three of the most pressing technical challenges associated with digitizing the supply chain.

Firstly, there are multiple actors and systems that generate data and integrate data across the supply chain – and it is crucial to verify the identity of each. Suppliers, OEMs, food processors, brands, recycling agents, consumers, ports, carriers, ground carriers, inspectors/authorities, freight forwarders, customs, retailers, repairers, etc. are all involved and all must be verified.

Secondly, there are many relationships across all these actors, and these relationships cross national borders with no central anchor and no single source of truth. In addition to business-to-business and business-to-consumer, there are also business-to-government and government-to-government relationships.

Thirdly, there are different functional needs linked to maintaining trust between the various actors through verifiable data. Traceability is key here. It is an enabler of compliance, product authenticity, transparency and provenance in terms of different types of applications. For example, traceability is essential for ethical procurement, food safety and effective recall.

Use case 1: Cross-border trade

For his first example, Cantera turned to cross-border trading operations.

“This is a multi-layered domain, and there are many different problems that need to be solved in different places,” he warned, before sharing a diagram that reins in the enormous complexity of the situation.

The key streams here are:

  • Financial procedures: The pure financial transaction between the two parties
  • Trading procedures: Any kind of document related to a commercial transaction
  • Transport procedures: All details about the transport of the goods
  • Regulatory procedures: The many documents that must be exchanged between importers, exporters and public authorities

So how is the IOTA Foundation working to optimize this complex and multi-layered domain? Cantera explains, “We allow different actors, different government agencies and private actors (traders) to share documents and verify documents in one shot. When a shipment moves between East Africa and Europe, all the trade certificates, all the documents can be verified in one shot by the different actors, and the authenticity and provenance of the documents can be properly traced. And as a result, the agility of the trading processes is improved. It is more efficient and more effective.”

All the actors in the flow visualized above share the documents through the infrastructure provided by IOTA’s distributed ledger using an architecture detailed by the second use case below.

Use Case 2: End-to-End supply chain traceability

In addition to dealing with document sharing and verification for cross-border trade, there is another challenge: tracing the origin of the traded goods. Cantera emphasizes that when we think about traceability, we must think about the definition of traceability given by the United Nations: “The ability to identify and trace the history, distribution, location and application of products, parts and materials, to ensure the reliability of sustainability claims, in the areas of human rights, labor (including health and safety), the environment and anti-corruption.”

In principle, traceability implies the ability to follow history. In the case of commercial goods, this means knowing what has happened to the item in question – not only the transport, but also the origin. If one of the parties involved in the supply chain makes a claim about sustainability, safety, etc., the validity of that claim must be verifiable.

Consider, for example, a seemingly simple bag of potato chips. A farmer sells potatoes to a food processor, which turns them into chips. To grow the potatoes, the farmer used fertilizer, which was produced by another company using raw materials from yet another supplier. And to turn the potatoes into chips, the food processor uses oil from yet another source. And so on. The history of all of these things – the potatoes, the fertilizer, the oil, the bag containing the chips – must be known for that bag of chips to be traceable.
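The upstream chain described above can be sketched as a simple provenance tree; the items and relationships below are illustrative stand-ins for the bag-of-chips example, not data from the talk:

```python
# Hypothetical bill-of-materials style provenance tree: each item
# maps to the items it was made from.
provenance = {
    "bag of chips": ["potatoes", "frying oil", "packaging bag"],
    "potatoes": ["fertilizer"],
    "fertilizer": ["raw minerals"],
    "frying oil": ["sunflower seeds"],
}

def trace(item, tree):
    """Recursively collect every upstream input of an item."""
    inputs = tree.get(item, [])
    result = list(inputs)
    for child in inputs:
        result.extend(trace(child, tree))
    return result

print(trace("bag of chips", provenance))
```

Tracing the bag of chips surfaces not just its direct inputs but the fertilizer behind the potatoes and the minerals behind the fertilizer – exactly the kind of transitive history traceability demands.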

All of these details – from when the potatoes were harvested to the fertilizer used, where the fertilizer came from, and so on – are considered critical events. And each of these critical tracking events has key data elements that describe the who, what, when, where, why, and even how.
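One minimal way to model such a critical tracking event is a record with one field per dimension; the class and sample values below are a hypothetical sketch, loosely following the who/what/when/where/why framing described above:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a critical tracking event (CTE) with its
# key data elements (KDEs) as who/what/when/where/why/how fields.
@dataclass
class CriticalTrackingEvent:
    who: str                    # responsible party, e.g. a farm or processor
    what: list[str]             # identifiers of the items involved
    when: str                   # ISO 8601 timestamp
    where: str                  # location identifier
    why: str                    # business step, e.g. "harvesting", "shipping"
    how: Optional[str] = None   # optional method/condition details

# Illustrative harvest event (all identifiers are made up).
harvest = CriticalTrackingEvent(
    who="Acme Farms",
    what=["lot:potatoes-2022-10-14"],
    when="2022-10-14T06:00:00Z",
    where="geo:farm-plot-7",
    why="harvesting",
)
print(harvest.why)
```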

How IOTA tackled its biggest technical challenges

The IOTA Foundation used several core technologies to address the biggest technical challenges across these use cases:

  • Data interoperability
  • Scalable data stores
  • Scalable, permissionless, feeless distributed ledger technology

Data interoperability

In these and similar use cases, many different actors must exchange data, which requires a standard syntax, with reference vocabularies, for semantic interoperability. It all also needs to be extensible to accommodate the specialized needs of different industries (for example, the automotive industry and the seafood industry have distinctly different nuances). Some of the key technologies used here include JSON-LD from the W3C, EPCIS 2.0 from GS1 and the edi3 reference data models from UN/CEFACT. IOTA also used sector standards for data interoperability – for example, DCSA (maritime transport), MOBI (connected vehicles and IoT commerce) and the Global Dialogue on Seafood Traceability, to name a few.

It is worth noting that IOTA was deeply involved in the development of EPCIS 2.0, a vocabulary and data model (plus a JSON-based serialization format and associated REST APIs) that enables stakeholders to share transactional information about the movement and status of objects (physical or digital), identified by keys. Using this model, each event captures the what, when, where and why of an object's movement and is serialized as JSON-LD.
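As a rough illustration of the serialization (the identifiers and timestamps below are made up, not taken from the talk), an EPCIS 2.0 ObjectEvent expressed as JSON-LD can be built and serialized like this:

```python
import json

# A minimal, illustrative EPCIS 2.0 ObjectEvent as JSON-LD.
# All identifiers (EPC URNs, read points) are hypothetical examples.
event = {
    "@context": "https://ref.gs1.org/standards/epcis/2.0.0/epcis-context.jsonld",
    "type": "ObjectEvent",
    "eventTime": "2022-10-14T08:30:00.000Z",            # when
    "eventTimeZoneOffset": "+03:00",
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],  # what
    "action": "OBSERVE",
    "bizStep": "shipping",                                # why
    "disposition": "in_transit",
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00777.0"},  # where
}

serialized = json.dumps(event, indent=2)
print(serialized)
```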

Scalable data stores with ScyllaDB NoSQL

Establishing a scalable data store for all critical data related to each supply chain event was another challenge. Cantera explained: “If we track every single item in the supply chains, we have to store a lot of data, and this is a big data problem. And here ScyllaDB offers many advantages. We can scale our data very easily. We can keep the data at a fine level of granularity for a long time. Not only that, but we can also combine the best of the NoSQL and SQL worlds, because we can have robust schemas that give us robust and reliable data.”
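As a sketch of what such an event store could look like (the keyspace, table and column names below are hypothetical, not from the talk), a CQL schema might partition events by item identifier so a part's entire history lives in one partition, clustered by time:

```python
# Hypothetical CQL schema for an EPCIS-style event store on ScyllaDB.
# Partitioning by item_id keeps each part's event history in a single
# partition; clustering by event_time enables efficient range scans.
CREATE_EVENTS_TABLE = """
CREATE TABLE IF NOT EXISTS supply_chain.events (
    item_id    text,       -- e.g. an EPC URN identifying the part
    event_time timestamp,  -- when the event occurred
    event_type text,       -- e.g. ObjectEvent, TransactionEvent
    biz_step   text,       -- e.g. shipping, receiving
    payload    text,       -- full EPCIS 2.0 JSON-LD document
    PRIMARY KEY (item_id, event_time)
) WITH CLUSTERING ORDER BY (event_time DESC);
"""

# With a CQL driver this string would be passed to session.execute();
# it is shown here purely as a schema sketch.
print(CREATE_EVENTS_TABLE.strip().splitlines()[0])
```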

Cantera then went on to detail ScyllaDB’s role in this architecture, giving an example from the automotive supply chain. Consider an OEM with 10 million cars produced per year. Assume that:

  • Each car has 3,000 traceable parts.
  • Each part can have a lifespan of 10 years.
  • Each part can generate 10 business events.

This means around 300 billion active business events to store in ScyllaDB. Another example: Consider a maritime transport operator that moves 50 million containers per year. Given 10 events per container and five years of operation, Cantera estimates approximately 2.5 billion active events here – just from the EPCIS 2.0 event repository. And there are several other layers that require this same level of data scalability.
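The back-of-the-envelope arithmetic behind those two estimates works out as follows:

```python
# Automotive OEM example: cars/year * traceable parts/car * events/part,
# with each part active for roughly its 10-year lifespan.
cars_per_year = 10_000_000
parts_per_car = 3_000
events_per_part = 10
oem_events = cars_per_year * parts_per_car * events_per_part
print(f"{oem_events:,}")       # 300,000,000,000 -> ~300 billion

# Maritime operator example: containers/year * events/container * years.
containers_per_year = 50_000_000
events_per_container = 10
years_of_operation = 5
maritime_events = containers_per_year * events_per_container * years_of_operation
print(f"{maritime_events:,}")  # 2,500,000,000 -> ~2.5 billion
```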

He concludes his discussion of this challenge with a look at the many applications for ScyllaDB across this initiative:

  • Event storage (EPCIS 2.0, DCSA, …)
  • Item-level tracking
  • Inventory
  • Catalog
  • Any DLT Layer 2 data storage

Scalable, permissionless, feeless distributed ledger technology

Scalable, permissionless and feeless distributed ledger technology also played a key role in the solution developed by the IOTA Foundation. For this, it used IOTA’s distributed ledger in combination with decentralized storage such as IPFS to provide the functionality around data and document verifiability, auditability and immutability within these peer-to-peer interactions.

For example, say you hire a specific carrier to move goods. When the activity starts, the carrier can generate an event that the goods have started moving through the supply chain, and these events are committed to the IOTA distributed ledger. More specifically, the originator of the event generates a transaction on the distributed ledger, and that transaction can later be used by any participant in the supply chain to verify the authenticity of the event. And once the event is committed, the originator can no longer change it. If the event was changed, the verification step would fail and the supply chain partners would understandably be concerned.
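The commit-then-verify pattern described above can be sketched in a few lines; note that the actual IOTA ledger APIs are not shown here – a plain dictionary stands in for the ledger, and the transaction identifier is made up:

```python
import hashlib
import json

def digest(document: dict) -> str:
    """Deterministic hash of a document's canonical JSON form."""
    canonical = json.dumps(document, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

ledger = {}  # stand-in for the append-only distributed ledger

# 1. The carrier commits the event: only its hash is anchored on-ledger.
event = {"item": "container-42", "bizStep": "departing", "when": "2022-10-14"}
tx_id = "tx-0001"  # hypothetical transaction identifier
ledger[tx_id] = digest(event)

# 2. Any supply chain participant can later verify a received copy of
#    the event against the anchored hash.
def verify(tx_id: str, document: dict) -> bool:
    return ledger.get(tx_id) == digest(document)

print(verify(tx_id, event))     # True: untouched event verifies
tampered = {**event, "bizStep": "arrived"}
print(verify(tx_id, tampered))  # False: any change breaks verification
```

Because only the hash is anchored, participants can exchange the documents themselves off-ledger while still detecting any after-the-fact modification.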

Cantera then showed how all of these pieces come together in a reference architecture.

Tip: For Cantera’s block-by-block tour of this reference architecture, watch the video below, starting at 17:15.

Conclusions

Digitizing the supply chain is fraught with technical challenges, so it is not surprising that a non-traditional mix of technologies is required to meet the IOTA Foundation’s highly specialized needs. Cantera sums it up nicely:

“It requires interoperability – which means it’s important to adopt the open standards: EPCIS 2.0, and the decentralized IDs and verifiable credentials that come from the W3C. It requires a reference architecture to guarantee that semantic interoperability and some reusable building blocks are used. It requires decentralization, and decentralization of data requires distributed ledger technology – especially public, permissionless and feeless distributed ledgers like IOTA, supplemented by IPFS, which decentralized applications rely on more and more. It also requires data scalability and availability, and ScyllaDB is the perfect partner here. Last but not least, it requires reliable data sharing with technologies such as decentralized IDs, distributed ledger technology and peer-to-peer.”
