Update C_API.md #29

Merged
merged 1 commit on Nov 27, 2018
28 changes: 13 additions & 15 deletions docs/C_API.md
@@ -1,17 +1,17 @@
# C API

-# Q: Why having a C API?
-Q: Why not just live in C++ world? Why must C?
-A: We want to distribute onnxruntime as a DLL, which can be used in .Net languages through [P/Invoke](https://docs.microsoft.com/en-us/cpp/dotnet/how-to-call-native-dlls-from-managed-code-using-pinvoke).
-Then this is the only option we have.
+# Q: Why have a C API?
+Q: Why not just live in a C++ world? Why C?
+A: We want to distribute the onnxruntime as a DLL, which can be used in .Net languages through [P/Invoke](https://docs.microsoft.com/en-us/cpp/dotnet/how-to-call-native-dlls-from-managed-code-using-pinvoke).
+This is the only option we have.

Q: Is it only for .Net?
-A: No. It is designed for
-1. Creating language bindings for onnxruntime.e.g. C#, python, java, ...
-2. Dynamic linking always has some benefits. For example, for solving diamond dependency problem.
+A: No. It is designed for:
+1. Creating language bindings for the onnxruntime. e.g. C#, python, java, ...
+2. Dynamic linking has some benefits. For example, solving diamond dependency problems.

Q: Can I export C++ types and functions across DLL or "Shared Object" Library(.so) boundaries?
-A: Well, you can, but it's not a good practice. And we won't do it in this project.
+A: Well, you can, but it's not a good practice. We won't do it in this project.


## What's inside
@@ -26,14 +26,12 @@ A: Well, you can, but it's not a good practice. And we won't do it in this proje

## How to use it

-Include [onnxruntime_c_api.h](include/onnxruntime/core/session/onnxruntime_c_api.h) in your source code.
-
-Then,
-1. Call ONNXRuntimeInitialize
-2. Create Session: ONNXRuntimeCreateInferenceSession(env, model_uri, nullptr,...)
-3. Create Tensor
+1. Include [onnxruntime_c_api.h](include/onnxruntime/core/session/onnxruntime_c_api.h).
+2. Call ONNXRuntimeInitialize
+3. Create Session: ONNXRuntimeCreateInferenceSession(env, model_uri, nullptr,...)
+4. Create Tensor
1) ONNXRuntimeCreateAllocatorInfo
2) ONNXRuntimeCreateTensorWithDataAsONNXValue
-4. ONNXRuntimeRunInference
+5. ONNXRuntimeRunInference
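
For readers skimming the diff, here is a rough sketch of the call sequence the revised "How to use it" steps describe. Only the function names come from the list above; every type name, argument list, and constant in the sketch is an assumption made for illustration, so treat it as pseudocode and consult onnxruntime_c_api.h for the actual signatures.

```c
/*
 * Sketch only: the function names are taken from the steps above, but the
 * argument lists, type names (ONNXEnv, ONNXSession, ONNXValue, ...) and
 * constants are assumptions. See
 * include/onnxruntime/core/session/onnxruntime_c_api.h for real signatures.
 */
#include <stddef.h>
#include "onnxruntime_c_api.h"

int run_model(const char* model_uri) {
  /* Step 2: initialize the runtime environment (assumed out-parameter style). */
  ONNXEnv* env = NULL;
  ONNXRuntimeInitialize("example-app", &env);

  /* Step 3: create an inference session from a model file; NULL = default options. */
  ONNXSession* session = NULL;
  ONNXRuntimeCreateInferenceSession(env, model_uri, NULL, &session);

  /* Step 4a: describe where the tensor data lives (CPU memory here). */
  ONNXRuntimeAllocatorInfo* allocator_info = NULL;
  ONNXRuntimeCreateAllocatorInfo("Cpu", 0 /* device id */, &allocator_info);

  /* Step 4b: wrap an existing float buffer as a tensor-valued ONNXValue. */
  float input_data[4] = {1.0f, 2.0f, 3.0f, 4.0f};
  size_t shape[2] = {1, 4};
  ONNXValue* input = NULL;
  ONNXRuntimeCreateTensorWithDataAsONNXValue(allocator_info, input_data,
                                             sizeof(input_data), shape, 2, &input);

  /* Step 5: run inference; input/output names depend on the model. */
  const char* input_names[] = {"input"};
  const char* output_names[] = {"output"};
  ONNXValue* output = NULL;
  ONNXRuntimeRunInference(session, input_names, &input, 1,
                          output_names, 1, &output);

  /* The real API also exposes release/free calls for these objects; omitted here. */
  return 0;
}
```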