The Henry Ford Museum and GE have reignited their storied roots, joining forces to protect priceless artifacts of the 20th Century. Now, with the benefits of internet-connected devices, it is possible to perform better analysis and maintain the quality of the artifacts going forward.

What it does

Using materials science and a neural network, we created a model to help understand and process the temperature and humidity data gathered and stored in Predix. Temperature and humidity data on its own cannot provide the insight needed to make actionable decisions about the current state of the environment the artifacts are in.

The trained neural network converts the temperature and humidity data into expected life years, which are fed into the front-end interface. Curators interact with this data in two ways: the data is visually available, and it is also monitored in the back-end. If the projected state of the system diverges from the expected state, the system takes two actions. First, a corrective action is taken by automating the adjustment of the on-site thermostat using a servo and an Arduino. If there is a larger system failure and this corrective action does not bring the system back into the expected range, the staff is alerted by text message or push notification.
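The back-end monitoring step can be sketched as a small decision function. This is a minimal illustration of the escalation logic described above, not the actual code; the threshold values and names (`DIVERGENCE_LIMIT`, `CORRECTION_TIMEOUT`, `checkSystem`) are placeholders.

```javascript
// Illustrative sketch of the monitoring loop's decision step.
// Thresholds are placeholders, not the values used in the project.
const DIVERGENCE_LIMIT = 2.0;   // acceptable gap, in expected life years
const CORRECTION_TIMEOUT = 3;   // failed checks allowed before escalating

function checkSystem(expectedLifeYears, projectedLifeYears, failedChecks) {
  const divergence = expectedLifeYears - projectedLifeYears;
  if (divergence <= DIVERGENCE_LIMIT) {
    // Projection is within range: nothing to do, reset the counter.
    return { action: 'none', failedChecks: 0 };
  }
  if (failedChecks < CORRECTION_TIMEOUT) {
    // First response: nudge the on-site thermostat via the Arduino/servo rig.
    return { action: 'adjust-thermostat', failedChecks: failedChecks + 1 };
  }
  // Corrective action did not restore the range: alert the staff.
  return { action: 'alert-staff', failedChecks: failedChecks + 1 };
}
```

Keeping this step a pure function makes it easy to test independently of the Predix feed and the hardware.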

How we built it

We started by understanding the context of the problem we wanted to solve. After interviewing the staff in charge of the area and seeing it firsthand, we broke off into groups to begin work. Research into paper and how it degrades with respect to temperature and humidity was modeled and fed into a custom neural network. The data from our research provided the context for the neural network to train on, letting it predict outcomes for any combination of temperature and humidity within range.
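As a rough sketch of how the degradation research can be turned into training data for the network, consider an Arrhenius-style rule of thumb (reaction rates roughly double per +10 °C, and moisture accelerates hydrolysis). The constants and function names below are illustrative placeholders, not the published values or the actual model we trained.

```javascript
// Illustrative paper-degradation model used to generate (temp, humidity) ->
// expected-life-years training pairs. All constants are placeholders.
function expectedLifeYears(tempC, relHumidityPct) {
  const BASE_LIFE = 500;  // assumed life years at the reference conditions
  const REF_TEMP = 20;    // reference temperature, °C
  const REF_RH = 45;      // reference relative humidity, %
  // Rate roughly doubles per +10 °C; humidity scales the rate linearly.
  const tempFactor = Math.pow(2, (tempC - REF_TEMP) / 10);
  const humidityFactor = relHumidityPct / REF_RH;
  return BASE_LIFE / (tempFactor * humidityFactor);
}

// Sweep the in-range grid to build a training set for the network.
function buildTrainingSet() {
  const samples = [];
  for (let t = 10; t <= 30; t += 2) {
    for (let rh = 20; rh <= 70; rh += 5) {
      samples.push({ input: [t, rh], output: [expectedLifeYears(t, rh)] });
    }
  }
  return samples;
}
```

The resulting `{ input, output }` pairs are in the shape the Neataptic trainer expects, so a network can learn the full surface and interpolate between the researched data points.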

The data was modeled in RStudio to visualize which areas are safe for the artifacts. A front-end interface was created to show the current state of the system. Using a Node.js-based API proxy, the data from the Predix system was transformed into the proper object models needed for each stage of the project. Lastly, an Arduino sketch was written to demonstrate how a microcontroller could physically take corrective actions on behalf of the staff.
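The proxy's transform step can be sketched as a pure function. The input shape below only loosely mirrors a Predix Time Series query response, and the field names on both sides are illustrative rather than taken from our actual code.

```javascript
// Sketch of the API proxy's transform: reshape a time-series response
// (hypothetical shape) into the object model the front end consumes.
function toFrontEndModel(predixResponse) {
  return predixResponse.tags.map((tag) => ({
    sensor: tag.name,  // e.g. "gallery-3:temperature"
    readings: tag.results[0].values.map(([timestamp, value]) => ({
      time: new Date(timestamp).toISOString(),
      value,
    })),
  }));
}
```

Centralizing the reshaping in the proxy meant the front end, the monitor, and the visualizations could each consume the same normalized objects.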

Challenges we ran into

The learning curve for the Predix platform was steeper than we anticipated. Initially we had intended to work directly within the system, but instead decided to innovate on our own in an environment we were comfortable in, planning to incorporate our work into Predix later.

Accomplishments that we're proud of

Our team had prior experience in each aspect of our solution, and the end product was better off because of it.

What we learned

Our team was largely made up of people who had never met prior to this hackathon. We learned to walk together as a team, which wasn't hard because this team is made up of awesome, creative, open-minded people. We also learned about a part of the museum that many never see or know about, and the current and pressing problem that part of the museum faces.

What's next for DIHack10

A lifetime of innovation and friendship, I bet.

Built With

predix cloud-foundry matlab rStudio html5 angular d3 neataptic arduino material-science node.js




Jen did the front end, and worked with Larry to make sure his work fed into a user-friendly interface with relevant info for curators and museum staff. Jen was an indispensable team member, applying her chemistry knowledge to identify the likely material candidates when classifying artifact composition. Her work was critical in pivoting away from my focus on synthetic plastic off-gassing as the factor most detrimental to artifact RUL; when I saw from her grid that cellulose was a component of a few artifacts I had disregarded as sources of degradation catalysts, we were able to adapt quickly.
