Microsoft Azure AI Fundamentals AI-900 Practice Question
Your company has trained a machine learning model and needs to score a large dataset to generate predictions for later analysis. The predictions are not required immediately and can be produced asynchronously.
Which deployment option in Azure Machine Learning should you recommend?
Deploying the model for batch inference is the most appropriate option in this scenario. Batch inference is designed for processing large volumes of data where predictions do not need to be immediate. It allows for asynchronous processing, making it efficient and cost-effective for non-time-sensitive tasks. Real-time endpoint deployment, Azure Kubernetes Service, and Azure Container Instances are better suited for scenarios requiring immediate predictions or handling live data streams, which is not the case here.
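For context, invoking a batch endpoint from the Azure Machine Learning Python SDK v2 (azure-ai-ml) might look like the sketch below. It is illustrative only: it assumes a batch endpoint named "batch-scoring" with a deployed model already exists, and the subscription, resource group, workspace, and datastore path are all placeholders.

```python
from azure.ai.ml import MLClient, Input
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Connect to the workspace (all identifiers below are placeholders).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Point the job at a folder of input data in a workspace datastore
# (hypothetical path for illustration).
input_data = Input(
    type=AssetTypes.URI_FOLDER,
    path="azureml://datastores/workspaceblobstore/paths/scoring-input/",
)

# Invoke the batch endpoint. This queues an asynchronous scoring job
# rather than returning predictions in the HTTP response.
job = ml_client.batch_endpoints.invoke(
    endpoint_name="batch-scoring",
    input=input_data,
)

# Optionally stream the job logs; predictions are written to the
# endpoint's configured output location when the job completes.
ml_client.jobs.stream(job.name)
```

Note the contrast with a real-time endpoint: here the invoke call only queues a job, and the predictions land in storage once the run finishes, which is exactly the asynchronous, non-time-sensitive pattern the question describes.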
Exam objective: Describe Fundamental Principles of Machine Learning on Azure