Developers can build machine learning models with IBM Watson, training them on data held in an enterprise repository. Once the model is ready, they can run it through the Core ML converter tools and embed it in their Apple app. This allows the two partners to make the apps created under the partnership even smarter with machine learning.
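As a minimal sketch of that last step, assuming the converted model ships in the app bundle under the hypothetical name WatsonClassifier.mlmodelc, a Swift app can load it with Core ML's generic MLModel API:

```swift
import CoreML

// Load a compiled Core ML model shipped in the app bundle.
// "WatsonClassifier" is a hypothetical name for a Watson-trained
// model converted with the Core ML converter tools.
guard let modelURL = Bundle.main.url(forResource: "WatsonClassifier",
                                     withExtension: "mlmodelc") else {
    fatalError("WatsonClassifier.mlmodelc not found in the app bundle")
}

do {
    let model = try MLModel(contentsOf: modelURL)
    // Inspect the inputs and outputs the converted model declares,
    // e.g. an image input and a label/probability output.
    print(model.modelDescription.inputDescriptionsByName)
    print(model.modelDescription.outputDescriptionsByName)
} catch {
    print("Failed to load the converted model: \(error)")
}
```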
Apple developers seek a quick and easy way to build these apps while leveraging the cloud where the machine learning behind them is delivered.
IBM also announced a cloud console to simplify the connection between the Watson model-building process and the model's insertion into the application running on the Apple device. The app can then share data back with Watson, improving the machine learning model running on the edge device in a classic device-cloud partnership.
The app runs in real time and does not need a live connection to Watson. As equipment parts are classified on the device, the results are collected; later, when a connection to Watson is available, even over low bandwidth, that data can be fed back to retrain the machine learning model and make it even better.
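A minimal sketch of that collect-then-feed-back loop follows, assuming a hypothetical Watson feedback endpoint and a simple JSON payload; the real URL and authentication scheme would come from IBM's cloud console, not from this sketch:

```swift
import Foundation

// A classification result captured on the device while offline.
struct ClassificationRecord: Codable {
    let label: String
    let confidence: Double
    let timestamp: Date
}

// Buffer results locally, then upload the batch once connectivity returns.
final class FeedbackUploader {
    private var buffer: [ClassificationRecord] = []

    func record(label: String, confidence: Double) {
        buffer.append(ClassificationRecord(label: label,
                                           confidence: confidence,
                                           timestamp: Date()))
    }

    // `endpoint` is a hypothetical Watson feedback URL supplied by
    // the cloud console; the payload format is assumed, not documented.
    func flush(to endpoint: URL) {
        guard !buffer.isEmpty,
              let body = try? JSONEncoder().encode(buffer) else { return }

        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = body

        URLSession.shared.dataTask(with: request) { _, _, error in
            // Only clear the buffer once the upload succeeds, so results
            // survive a failed or interrupted connection.
            if error == nil { self.buffer.removeAll() }
        }.resume()
    }
}
```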
Consider an app that lets field technicians use their iPhones or iPads to scan the electrical equipment they are inspecting and automatically detect anomalies. Because inference happens on the device, there is no need to send that data to IBM's cloud computing data centers for processing, which cuts the time it takes to detect equipment faults to near real time.
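Continuing the earlier sketch, such an inspection step might wrap the loaded model in Apple's Vision framework to classify a photo of the equipment entirely on the device; the label shown in the comment is hypothetical and depends on what the Watson-trained model was taught:

```swift
import Vision
import CoreML
import UIKit

// Classify a photo of equipment on the device; no network round trip.
// `model` is the MLModel loaded earlier from the app bundle.
func detectAnomaly(in photo: UIImage, using model: MLModel) throws {
    let visionModel = try VNCoreMLModel(for: model)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // e.g. "corroded_connector 0.94"; labels depend on the trained model.
        print("\(top.identifier) \(top.confidence)")
    }

    guard let cgImage = photo.cgImage else { return }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
}
```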
To learn more about this project in detail, see Apple's official blog post.