Inspiration
As an IT service provider for customs and security, we see the pain points our airline customers suffer due to missing data elements and poor data quality. These cause costly delays and stress in the supply chain and require vast amounts of time and money to fix. To address this, we would like to take advantage of current AI technology to give the human eye some much-needed help. As many governments worldwide have implemented new security measures requiring pre-load checks to be performed and approved, clean, high-quality data, containing valid harmonized customs commodity codes to name just one element, will be key to getting the necessary approvals, avoiding customs delays, and increasing supply chain efficiency.
What it does
We envision an automated validation process, connected to ONE Record, that checks, improves, and updates data quality. The solution suggests corrections, modifies records, or prompts the user for clarification when poor data quality is detected, with an AI component that learns over time and increases its autocorrection capabilities. Through a simple GUI, the AI scans, checks, and verifies each record and prompts for a human response on those records the machine cannot autocorrect. This would greatly increase the quality of data shared throughout the supply chain and effectively avoid costly delays.
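To illustrate the check/autocorrect/escalate flow described above, here is a minimal rule-based sketch for one field, the harmonized commodity code. All class and method names are illustrative, and in the envisioned solution the AI component would augment or replace this fixed rule set:

```java
import java.util.Optional;
import java.util.regex.Pattern;

// Minimal sketch of the check -> autocorrect -> escalate flow; names are illustrative.
public class RecordValidator {
    // Harmonized System codes are 6 digits at the international level,
    // with national extensions of up to 10 digits in total.
    private static final Pattern HS_CODE = Pattern.compile("\\d{6}(\\d{2}(\\d{2})?)?");

    // Returns a cleaned code, or empty if the record must be escalated to a human.
    public static Optional<String> checkHsCode(String raw) {
        if (raw == null) return Optional.empty();
        // Autocorrect common formatting noise: spaces, dots, dashes.
        String cleaned = raw.replaceAll("[\\s.\\-]", "");
        if (HS_CODE.matcher(cleaned).matches()) {
            return Optional.of(cleaned);
        }
        return Optional.empty(); // not autocorrectable -> prompt the user via the GUI
    }

    public static void main(String[] args) {
        System.out.println(checkHsCode("0901.21")); // dotted HS code -> autocorrected
        System.out.println(checkHsCode("coffee"));  // free text -> human clarification needed
    }
}
```

Records that come back empty would be queued in the GUI for a human decision, and those decisions are exactly the feedback the AI component could learn from over time.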
How we built it
Our development team used a Java development environment and AI development tools, training with actual data provided by CHAMP Cargosystems, and installed an instance of the NEONE server on developer machines.
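For context, talking to a local NEONE instance is plain HTTP with JSON-LD bodies. The sketch below only builds the request; the host, port, and endpoint path are placeholder assumptions for a developer-machine setup, not our actual configuration:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch only: localhost:8080 and /logistics-objects are placeholder assumptions
// for a local NEONE (ONE Record server) instance.
public class NeOnePost {
    public static HttpRequest buildCreateRequest(String jsonLd) {
        return HttpRequest.newBuilder(URI.create("http://localhost:8080/logistics-objects"))
                .header("Content-Type", "application/ld+json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonLd))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildCreateRequest("{\"@type\": \"cargo:Shipment\"}");
        System.out.println(req.method() + " " + req.uri());
        // Actually sending it would use HttpClient.newHttpClient().send(req, ...)
        // against the running NEONE server.
    }
}
```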
Challenges we ran into
During our initial testing we discovered the NEONE server had some issues and required a patch, which the NEONE team delivered quickly once we reported it. After that point, we must admit, things went smoothly. Another snag was a major difference we discovered between the ontology documentation and the APIs, which took us several hours to reconcile so the machine could read and populate the correct matching fields.
Accomplishments that we're proud of
We are thrilled to see our AI clearly interpreting the data and updating the required fields. This was made possible by using real data that we structured into 1R models in order to exercise our solution. It was cool to be able to use various APIs and technologies to make this happen.
What we learned
One of the key takeaways for us was that 1R is a living product that relies on the contribution of the community: each partner is able to test and feed back issues, making the server better for us all.
What's next for Scrub-x data cleaners
We will continue to work with the machine and upload more cargo data to train it to be as accurate as possible. We will also work on selecting business processes where our technology makes sense, to make it available to our customers and beyond.

