
Commit

regisss committed Jan 4, 2024
1 parent 3c9c914 commit 366b2e2
Showing 4 changed files with 42 additions and 14 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/build_main_documentation.yml
@@ -142,7 +142,7 @@ jobs:
       - name: Combine subpackage documentation
         run: |
           cd optimum
-          sudo python docs/combine_docs.py --subpackages graphcore habana intel neuron furiosa amd --version ${{ env.VERSION }}
+          sudo python docs/combine_docs.py --subpackages nvidia amd intel neuron habana furiosa graphcore --version ${{ env.VERSION }}
           cd ..
       - name: Push to repositories
2 changes: 1 addition & 1 deletion .github/workflows/build_pr_documentation.yml
@@ -128,7 +128,7 @@ jobs:
       - name: Combine subpackage documentation
         run: |
           cd optimum
-          sudo python docs/combine_docs.py --subpackages graphcore habana intel neuron furiosa --version pr_$PR_NUMBER
+          sudo python docs/combine_docs.py --subpackages nvidia amd intel neuron habana furiosa graphcore --version pr_$PR_NUMBER
           sudo mv optimum-doc-build ../
           cd ..
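Both workflow jobs hand the subpackage list to docs/combine_docs.py through a --subpackages flag that accepts several names at once, plus a --version flag. As a rough sketch only (the actual parser is defined elsewhere in combine_docs.py and may be declared differently), such a command-line interface could be set up with argparse along these lines:

# Hypothetical sketch of the CLI used above; names and defaults are assumptions, not taken from this commit.
import argparse

parser = argparse.ArgumentParser(
    description="Combine subpackage documentation into the main Optimum doc build."
)
parser.add_argument(
    "--subpackages",
    nargs="+",
    required=True,
    help="Subpackages to merge, e.g. nvidia amd intel neuron habana furiosa graphcore",
)
parser.add_argument(
    "--version",
    required=True,
    help="Documentation version to publish, e.g. main or pr_1234",
)

if __name__ == "__main__":
    args = parser.parse_args()
    print(f"Combining docs for {args.subpackages} at version {args.version}")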
28 changes: 28 additions & 0 deletions docs/combine_docs.py
@@ -105,6 +105,31 @@ def add_neuron_doc(base_toc: List):
     )


+def add_nvidia_doc(base_toc: List):
+    """
+    Extends the table of content with a section about Optimum Nvidia.
+    Args:
+        base_toc (List): table of content for the doc of Optimum.
+    """
+    # Update optimum table of contents
+    base_toc.insert(
+        1,
+        {
+            "sections": [
+                {
+                    # Ideally this should directly point at https://huggingface.co/docs/optimum-nvidia/index
+                    # Current hacky solution is to have a redirection in _redirects.yml
+                    "local": "https://github.com/huggingface/optimum-nvidia",
+                    "title": "🤗 Optimum Nvidia",
+                }
+            ],
+            "title": "Nvidia",
+            "isExpanded": False,
+        },
+    )
+
+
 def main():
     args = parser.parse_args()
     optimum_path = Path("optimum-doc-build")
@@ -118,6 +143,9 @@ def main():
         if subpackage == "neuron":
             # Neuron has its own doc so it is managed differently
             add_neuron_doc(base_toc)
+        elif subpackage == "nvidia":
+            # At the moment, Optimum Nvidia's doc is the README of the GitHub repo
+            add_nvidia_doc(base_toc)
         else:
             subpackage_path = Path(f"{subpackage}-doc-build")
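For reference, here is a small self-contained sketch of what the new helper does to the combined table of contents. The toy base_toc and the YAML dump are illustrative only and are not part of the commit; PyYAML is assumed to be available.

import yaml  # PyYAML, assumed available for this illustration

# Toy stand-in for the table of contents that combine_docs.py assembles
base_toc = [
    {"title": "Optimum", "sections": [{"local": "index", "title": "Overview"}]},
    {"title": "Habana", "sections": [{"local": "habana/index", "title": "Overview"}]},
]

# The same insertion add_nvidia_doc performs: an external link slotted in at position 1,
# pointing at the optimum-nvidia GitHub README instead of a locally built doc page.
base_toc.insert(
    1,
    {
        "sections": [
            {
                "local": "https://github.com/huggingface/optimum-nvidia",
                "title": "🤗 Optimum Nvidia",
            }
        ],
        "title": "Nvidia",
        "isExpanded": False,
    },
)

print(yaml.safe_dump(base_toc, allow_unicode=True, sort_keys=False))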
24 changes: 12 additions & 12 deletions docs/source/_toctree.yml
@@ -95,6 +95,18 @@
       title: "TFLite"
   title: Exporters
   isExpanded: false
+- sections:
+  - local: bettertransformer/overview
+    title: Overview
+  - sections:
+    - local: bettertransformer/tutorials/convert
+      title: Convert Transformers models to use BetterTransformer
+    - local: bettertransformer/tutorials/contribute
+      title: How to add support for new architectures?
+    title: Tutorials
+    isExpanded: false
+  title: BetterTransformer
+  isExpanded: false
 - sections:
   - local: torch_fx/overview
     title: Overview
@@ -115,18 +127,6 @@
     isExpanded: false
   title: Torch FX
   isExpanded: false
-- sections:
-  - local: bettertransformer/overview
-    title: Overview
-  - sections:
-    - local: bettertransformer/tutorials/convert
-      title: Convert Transformers models to use BetterTransformer
-    - local: bettertransformer/tutorials/contribute
-      title: How to add support for new architectures?
-    title: Tutorials
-    isExpanded: false
-  title: BetterTransformer
-  isExpanded: false
 - sections:
   - local: llm_quantization/usage_guides/quantization
     title: GPTQ quantization
