CloudML Pipeline provides a pipeline that can register Jupyter Notebooks and workflow modelers as steps. The pipeline can also be configured to match the intent of the AI model developer (analyst), enabling optimal analysis jobs for each stage of the AI model lifecycle.
Users can define arguments for the resource settings and machine learning execution of each step. This allows various types of training to be run by setting execution options that fit the characteristics of each AI model. Additionally, shape analysis of the pipeline helps users gain a better understanding of each data conversion step and improves the accuracy of predicting training outcomes.
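As a rough illustration of per-step arguments and resource settings, the sketch below uses a hypothetical `PipelineStep` class (the names, fields, and defaults are assumptions for illustration, not the actual CloudML Pipeline API):

```python
from dataclasses import dataclass, field

@dataclass
class PipelineStep:
    # Hypothetical step definition: a name, the registered source to run
    # (e.g. a Jupyter Notebook), per-step resources, and execution arguments.
    name: str
    source: str
    cpu: str = "1"
    memory: str = "2Gi"
    args: dict = field(default_factory=dict)

# Per-step arguments let each step run with options that fit the model:
# a light preprocessing step and a heavier training step, for example.
preprocess = PipelineStep("preprocess", "prep.ipynb", args={"split": 0.2})
train = PipelineStep("train", "train.ipynb", cpu="4", memory="16Gi",
                     args={"epochs": 50, "lr": 1e-3})
pipeline = [preprocess, train]  # steps execute in order
```

The point of the sketch is only that resources and execution options are declared per step, so each stage of the lifecycle can be tuned independently.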
Log monitoring is provided in real time during AI model training, giving users an at-a-glance view of the experiment metrics of each step and allowing them to track training history.
- Monitoring execution history: Check the status and execution history of each step
- Real-time monitoring of execution logs: Monitor cumulative log files in real time during step execution
- Real-time monitoring of experiment metrics: Visualize and monitor experiment metrics (accuracy, loss, etc.) in real time
※ Real-time monitoring is applied during the service application stage of CloudML Experiments
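One simple way such real-time metric monitoring can work is for each step to append metric records to a log that a monitor tails and visualizes. The sketch below is a minimal, generic illustration of that pattern; the function name, file format, and log path are assumptions, not CloudML's actual mechanism:

```python
import json
import time

def log_metric(step: str, name: str, value: float, logfile: str = "metrics.log"):
    """Append one metric record as a JSON line; a monitoring process can
    tail this file and plot values as they arrive. (Illustrative only.)"""
    record = {"step": step, "metric": name, "value": value, "ts": time.time()}
    with open(logfile, "a") as f:
        f.write(json.dumps(record) + "\n")

# During training, emit accuracy and loss each epoch so they can be
# visualized live while the step is still running.
for epoch, (acc, loss) in enumerate([(0.71, 0.9), (0.84, 0.5), (0.91, 0.3)]):
    log_metric("train", "accuracy", acc)
    log_metric("train", "loss", loss)
```

Appending one self-describing record per event keeps the producer simple and lets the monitor aggregate by step or metric name without coordination.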
- Python model integration feature
- Define each execution step in the pipeline: creation, analysis/step, option/argument
- Allocate resources by step and provide an option to select the execution image
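To show what per-step resource allocation with a selectable execution image might look like, here is a minimal sketch. The configuration keys, the `render_resources` helper, and the container image tags are all hypothetical examples, not CloudML's actual schema:

```python
# Hypothetical per-step configuration: resources plus a chosen execution image.
step_config = {
    "preprocess": {"cpu": "2", "memory": "4Gi",
                   "image": "python:3.10"},
    "train":      {"cpu": "8", "memory": "32Gi",
                   "image": "tensorflow/tensorflow:2.15.0-gpu"},
}

def render_resources(step: str) -> dict:
    """Turn one step's settings into a container-spec-like fragment,
    the shape a pipeline backend might pass to its scheduler."""
    cfg = step_config[step]
    return {
        "image": cfg["image"],
        "resources": {"limits": {"cpu": cfg["cpu"], "memory": cfg["memory"]}},
    }
```

Selecting the image per step means a lightweight preprocessing step and a GPU-enabled training step can run in different environments within the same pipeline.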