Last week MHL (Material Handling & Logistics) magazine published a piece I wrote on using blockchain technology to deliver high-value business benefits in supply chain management. Some of those benefits include:
- Creation of an immutable, portable record of all hand-offs between parties that is not held hostage by any one vendor or party
- A complete, portable chain of custody useful for everything from regulatory compliance to managing product recalls
- (My favorite): Using smart contracts to create distributed, automated intelligence, such as automatically triggering notifications and invoicing when milestones are reached (CFOs would love that). This could be accelerated further by using smart sensors.
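The milestone-triggered invoicing idea can be illustrated with a short sketch. A real implementation would live in an actual smart contract on a blockchain; this plain-Python version (all names hypothetical) only shows the triggering logic: each recorded hand-off fires a notification, and the final milestone automatically issues the invoice.

```python
from dataclasses import dataclass, field

@dataclass
class ShipmentContract:
    """Hypothetical stand-in for a milestone-driven smart contract."""
    milestones: list                          # ordered milestone names
    reached: set = field(default_factory=set)
    notifications: list = field(default_factory=list)
    invoiced: bool = False

    def record_milestone(self, name: str) -> None:
        """Record a hand-off (e.g., as reported by a smart sensor)."""
        if name not in self.milestones or name in self.reached:
            return                            # unknown or duplicate event
        self.reached.add(name)
        self.notifications.append(f"notify: {name} reached")
        # Trigger invoicing automatically once the final milestone lands.
        if len(self.reached) == len(self.milestones):
            self.invoiced = True
            self.notifications.append("invoice issued")

contract = ShipmentContract(["picked_up", "in_transit", "delivered"])
for event in ["picked_up", "in_transit", "delivered"]:
    contract.record_milestone(event)

print(contract.invoiced)            # True
print(contract.notifications[-1])   # invoice issued
```

Because the trigger is pure state-machine logic over the shared milestone record, it integrates naturally with existing milestone-management and payment systems.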
My focus was on ideas that can truly be implemented today, integrating with existing systems for milestone management and payment processing (easing adoption and integration). Please visit MHL magazine to read the full article. You can also download the infographic hosted with the article here.
BTW, these same concepts can be applied anywhere serialization is needed: digital media sales, regulated product delivery, clinical trials, etc.
Fun fact: You can likely guess when I originally wrote the article based on the BTC price quoted for the music download. BTC has appreciated quite a bit since then. 😉
This week, at the Washington DC Spark Interactive, Savi Engineering shared some of our work on using Spark Streaming and expert-systems technology (Drools) to analyze Industrial IoT data in near-real time.
At Savi, we use a hybrid Lambda Architecture (see my post on why Lambda is so important). By “hybrid” we mean that, unlike in a pure Lambda Architecture, we cannot restate the past 100%, because we have already notified humans of critical IoT events (e.g., theft, safety risk). We can only enrich and auto-resolve those events as more data becomes available. You can find tips on how to do this, both in general with streaming technologies and specifically with Spark, in the following presentation. You can also learn more about tackling real-world IoT challenges:
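The "enrich, don't restate" constraint can be sketched in a few lines. This is a hypothetical toy (not our actual event store): once an alert has been emitted to humans it is treated as immutable, and late-arriving data may only add context or flip its status to auto-resolved.

```python
# Toy sketch of the hybrid-Lambda constraint: emitted alerts are never
# restated; late data can only enrich or auto-resolve them.

class AlertStore:
    def __init__(self):
        self.alerts = {}  # alert_id -> alert record

    def emit(self, alert_id, kind):
        """First sighting: humans are notified, so the alert is now fixed."""
        self.alerts[alert_id] = {"kind": kind, "status": "open", "context": []}

    def on_late_data(self, alert_id, detail, resolves=False):
        """Late-arriving data enriches the alert; it never replaces it."""
        alert = self.alerts.get(alert_id)
        if alert is None:
            return
        alert["context"].append(detail)          # enrich only
        if resolves and alert["status"] == "open":
            alert["status"] = "auto-resolved"    # close, don't delete

store = AlertStore()
store.emit("a1", "possible-theft")
store.on_late_data("a1", "GPS shows truck at authorized depot", resolves=True)
print(store.alerts["a1"]["status"])  # auto-resolved
```

In the streaming implementation, the same idea applies per micro-batch: new data joins against already-emitted alerts rather than recomputing them from scratch.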
In addition, at Savi we combine fully explicit rules with real-time machine learning algorithms to perform risk and performance analytics in near-real time (see my post on the differences in focus between our Data Engineers and Data Scientists). James Nowell of our Engineering team gave a great presentation on how we run Drools inside Spark RDDs (yes, Drools, and without performance penalties) to create linearly scaling expert systems that analyze all that IoT data as if we were an omniscient human. You can find his presentation here:
In future presentations, we will expand on areas such as:
- The differences in use of Spark (using the same data) between Data Scientists and Engineers
- How we scale machine learning algorithms for real-time, sub-second execution (thousands of times per second)
- Creating a DAG that combines hardware device edge intelligence with cloud-based intelligence
If you like what you see here, Savi is hiring. Take a look here.