# MindSpore Serving
[查看中文](./README_CN.md)
<!-- TOC -->

- [MindSpore Serving](#mindspore-serving)
    - [Overview](#overview)
    - [Installation](#installation)
        - [Installing Serving](#installing-serving)
        - [Configuring Environment Variables](#configuring-environment-variables)
    - [Quick Start](#quick-start)
    - [Documents](#documents)
        - [Developer Guide](#developer-guide)
    - [Community](#community)
        - [Governance](#governance)
        - [Communication](#communication)
    - [Contributions](#contributions)
    - [Release Notes](#release-notes)
    - [License](#license)

<!-- /TOC -->
## Overview
MindSpore Serving is a lightweight and high-performance service module that helps MindSpore developers efficiently
deploy online inference services in the production environment. After completing model training on MindSpore, you can
export the MindSpore model and use MindSpore Serving to create an inference service for the model.
MindSpore Serving architecture:
<img src="docs/architecture.png" alt="MindSpore Architecture" width="600"/>
MindSpore Serving consists of two parts: the `Client` and the `Server`. On a `Client` node, you can deliver inference service
commands through the gRPC or RESTful API. The `Server` consists of a `Main` node and one or more `Worker` nodes.
The `Main` node manages all `Worker` nodes and their model information, accepts user requests from `Client`s, and
distributes the requests to `Worker` nodes. A `Servable` is deployed on a `Worker` node; it represents a single model or a
combination of multiple models and can provide different services through various methods.

On the server side, when [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving
supports the Ascend 910/310P/310 and Nvidia GPU environments. When [MindSpore Lite](https://www.mindspore.cn/lite) is
used as the inference backend, MindSpore Serving supports the Ascend 310, Nvidia GPU, and CPU environments. The `Client` does not
depend on specific hardware platforms.
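The `Main`-to-`Worker` request distribution described above can be sketched as a minimal round-robin dispatcher. This is an illustrative sketch in plain Python, not the actual MindSpore Serving implementation; the `MainNode` class, its methods, and the worker addresses are hypothetical:

```python
from itertools import cycle

class MainNode:
    """Illustrative sketch: a Main node that tracks Worker nodes and
    distributes incoming client requests among them in round-robin order."""

    def __init__(self, workers):
        # Each worker is identified here by (address, servable name).
        self.workers = list(workers)
        self._next = cycle(self.workers)

    def dispatch(self, request):
        # Pick the next worker in round-robin order and pair it with the request.
        worker = next(self._next)
        return worker, request

# Two hypothetical Worker nodes serving the same Servable.
workers = [("127.0.0.1:6200", "resnet50"), ("127.0.0.1:6201", "resnet50")]
main = MainNode(workers)
targets = [main.dispatch({"id": i})[0] for i in range(4)]
```

In the real system the `Main` node also accounts for worker load and model versions; the round-robin choice here only illustrates the fan-out role the architecture diagram assigns to the `Main` node.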
MindSpore Serving provides the following functions:

- gRPC and RESTful APIs on clients
- Pre-processing and post-processing of assembled models
- Batching: multiple instance requests are split and combined to meet the `batch size` requirement of the model
- Simple Python APIs on clients
- Multi-model combination: combined multi-model and single-model scenarios use the same set of interfaces
- Distributed model inference
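As an illustration of the RESTful access path listed above, a prediction request body uses an `instances` layout. This is a sketch based on the `add` servable from the official quick-start example; the host, port, version, and input names `x1`/`x2` are assumptions that should be adapted to your own deployment:

```python
import json

# Endpoint following the documented URL pattern (values are illustrative):
# http://{host}:{port}/model/{servable}/version/{version}:predict
url = "http://127.0.0.1:1500/model/add/version/1:predict"

# Each instance carries the named inputs that the servable's method expects.
payload = {"instances": [{"x1": [[1.0, 1.0]], "x2": [[2.0, 2.0]]}]}
body = json.dumps(payload)

# A client would POST `body` to `url`; the server replies with a matching
# "instances" list holding the named outputs of the servable's method.
```

The same request can equally be issued through the gRPC client API; both paths address the same `Servable` and method on the `Server`.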
## Installation
For details about how to install and configure MindSpore Serving, see the [MindSpore Serving installation page](https://www.mindspore.cn/serving/docs/en/master/serving_install.html).
## Quick Start
[MindSpore-based Inference Service Deployment](https://www.mindspore.cn/serving/docs/en/master/serving_example.html) is
used to demonstrate how to use MindSpore Serving.
## Documents
### Developer Guide
- [gRPC-based MindSpore Serving Access](https://www.mindspore.cn/serving/docs/en/master/serving_grpc.html)
- [RESTful-based MindSpore Serving Access](https://www.mindspore.cn/serving/docs/en/master/serving_restful.html)
- [Services Provided Through Model Configuration](https://www.mindspore.cn/serving/docs/en/master/serving_model.html)
- [Services Composed of Multiple Models](https://www.mindspore.cn/serving/docs/en/master/serving_model.html#services-composed-of-multiple-models)
- [MindSpore Serving-based Distributed Inference Service Deployment](https://www.mindspore.cn/serving/docs/en/master/serving_distributed_example.html)
For more details about the installation guide, tutorials, and APIs,
see [MindSpore Python API](https://www.mindspore.cn/serving/docs/en/master/server.html).
## Community
### Governance
[MindSpore Open Governance](https://gitee.com/mindspore/community/blob/master/governance.md)
### Communication
- [MindSpore Slack](https://join.slack.com/t/mindspore/shared_invite/zt-dgk65rli-3ex4xvS4wHX7UDmsQmfu8w) developer
communication platform
## Contributions
Contributions to MindSpore Serving are welcome.
## Release Notes
[RELEASE](RELEASE.md)
## License
[Apache License 2.0](LICENSE)