
HLSPilot

HLSPilot is the first LLM-based framework for automatic HLS code generation and optimization from sequential C/C++ code.

This framework investigates the use of LLMs for HLS design-strategy learning and tool learning, and builds a complete hardware acceleration workflow spanning runtime profiling, kernel identification, automatic HLS code generation, design space exploration (DSE), and HW/SW co-design on a hybrid CPU-FPGA computing architecture.

Prerequisites

This project was tested on a Xilinx Alveo U280 with the Vitis HLS 2019.1 suite. The following Python libraries are required:

  • openai (v1.33.0)
  • langchain (v0.2.3)
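Assuming pip and Python 3 are available, the pinned versions above can be installed with (a suggested command, not from the repo):

```shell
pip install openai==1.33.0 langchain==0.2.3
```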

Usage

  1. Place your code under the benchmark directory.

  2. Run Code Analysis: This step calls the gprof tool to profile the code and extract performance bottlenecks, helping to complete the HW/SW partition.

    python3 src/hw_sw_partition/auto_analysis.py
  3. Task pipeline: This step splits the kernel into multiple subtasks so that the subtasks can execute as a pipeline, improving program parallelism.

    # update code_file_path in the script
    python3 src/task_pipeline/task_pipeline.py
  4. Task optimization: This step optimizes the subtask code by retrieving suitable pragmas from the optimization-strategy knowledge base and applying them to the code.

    # update code_file_path in the script
    python3 src/task_opt/task_opt.py
  5. DSE: This step calls the DSE tool to tune the pragma parameters.

    cd src/auto_dse
    python3 aoto_dse.py
    ./exec.sh run APP_NAME .cpp
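Step 2 above relies on gprof output to locate hot kernels. As an illustrative sketch (not code from this repository; the column parsing and the 50% threshold are assumptions), the flat-profile section of a gprof report can be scanned for functions that dominate runtime and are therefore candidates for FPGA offloading:

```python
import re

def parse_gprof_flat_profile(text, threshold=50.0):
    """Return (name, %time) pairs from a gprof flat profile whose
    self-time percentage is at least `threshold` (kernel candidates)."""
    hot = []
    for line in text.splitlines():
        # Flat-profile rows start with the %time column, e.g. " 92.00 ..."
        m = re.match(
            r"\s*(\d+\.\d+)\s+\d+\.\d+\s+\d+\.\d+\s+(?:\S+\s+){0,3}(\S+)\s*$",
            line,
        )
        if m:
            pct, name = float(m.group(1)), m.group(2)
            if pct >= threshold:
                hot.append((name, pct))
    return hot

# Hypothetical gprof output for a matrix-multiply benchmark
sample = """\
  %   cumulative   self              self     total
 time   seconds   seconds    calls  ms/call  ms/call  name
 92.00      0.92     0.92        1   920.00   920.00  matmul
  8.00      1.00     0.08        1    80.00    80.00  init
"""
print(parse_gprof_flat_profile(sample))  # → [('matmul', 92.0)]
```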
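The subtask pipelining in step 3 targets HLS dataflow execution on the FPGA, but the idea can be sketched in plain Python (an analogy, not repository code): subtasks become stages connected by queues, so stage n+1 works on item i while stage n already processes item i+1.

```python
import queue
import threading

def run_pipeline(items, stages):
    """Run a list of stage functions over `items` so that different
    stages work on different items concurrently, FIFO order preserved."""
    qs = [queue.Queue() for _ in range(len(stages) + 1)]
    STOP = object()  # sentinel flushed through the pipeline at the end

    def worker(stage, q_in, q_out):
        while True:
            x = q_in.get()
            if x is STOP:
                q_out.put(STOP)
                return
            q_out.put(stage(x))

    threads = [
        threading.Thread(target=worker, args=(s, qs[i], qs[i + 1]))
        for i, s in enumerate(stages)
    ]
    for t in threads:
        t.start()
    for x in items:
        qs[0].put(x)
    qs[0].put(STOP)

    results = []
    while True:
        y = qs[-1].get()
        if y is STOP:
            break
        results.append(y)
    for t in threads:
        t.join()
    return results

# e.g. two subtasks chained as a pipeline
print(run_pipeline([1, 2, 3], [lambda x: x + 1, lambda x: x * 2]))  # → [4, 6, 8]
```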

Citation

If you find this work useful in your research, please consider citing our work:

@article{xiong2024hlspilot,
  title={HLSPilot: LLM-based High-Level Synthesis},
  author={Xiong, Chenwei and Liu, Cheng and Li, Huawei and Li, Xiaowei},
  journal={arXiv preprint arXiv:2408.06810},
  year={2024}
}
