reverting openvino_docs_IE_DG_Bfloat16Inference and openvino_docs_IE_DG_Int8Inference
myshevts committed Mar 16, 2022
1 parent a77c7e4 commit 895f5d5
Showing 5 changed files with 15 additions and 5 deletions.
2 changes: 1 addition & 1 deletion docs/OV_Runtime_UG/Bfloat16Inference.md
@@ -1,4 +1,4 @@
-# Bfloat16 Inference
+# Bfloat16 Inference {#openvino_docs_IE_DG_Bfloat16Inference}
 
 ## Bfloat16 Inference Usage (C++)
 
2 changes: 1 addition & 1 deletion docs/OV_Runtime_UG/Int8Inference.md
@@ -1,4 +1,4 @@
-# Low-Precision 8-bit Integer Inference
+# Low-Precision 8-bit Integer Inference {#openvino_docs_IE_DG_Int8Inference}
 
 ## Disclaimer
 
10 changes: 10 additions & 0 deletions docs/OV_Runtime_UG/supported_plugins/CPU.md
@@ -1,5 +1,15 @@
 # CPU device {#openvino_docs_OV_UG_supported_plugins_CPU}
+
+@sphinxdirective
+
+.. toctree::
+   :maxdepth: 1
+   :hidden:
+
+   openvino_docs_IE_DG_Bfloat16Inference
+
+@endsphinxdirective
 
 ## Introducing the CPU Plugin
 The CPU plugin was developed to achieve high performance of neural networks on CPU, using the Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN).
2 changes: 1 addition & 1 deletion docs/documentation.md
@@ -34,7 +34,7 @@
 openvino_docs_deployment_optimization_guide_latency
 openvino_docs_IE_DG_Model_caching_overview
 openvino_docs_deployment_optimization_guide_tput
-openvino_docs_deployment_optimization_guide_hints
+openvino_docs_deployment_optimization_guide_hints
 openvino_docs_tuning_utilities
 openvino_docs_performance_benchmarks

4 changes: 2 additions & 2 deletions docs/optimization_guide/model_optimization_guide.md
@@ -8,6 +8,7 @@
 
 pot_README
 docs_nncf_introduction
+openvino_docs_IE_DG_Int8Inference
 
 @endsphinxdirective
 
@@ -31,5 +32,4 @@ POT is the easiest way to get optimized models, and usually takes several minute
 ![](../img/WHAT_TO_USE.svg)
 
 ## See also
-- [Deployment optimization](./dldt_deployment_optimization_guide.md)
-- [int8 runtime specifics](../OV_Runtime_UG/Int8Inference.md)
+- [Deployment optimization](./dldt_deployment_optimization_guide.md)
