Problem running xzlib in L3/demos #124

Open
DontSpillTheTea opened this issue Apr 5, 2022 · 5 comments
DontSpillTheTea commented Apr 5, 2022

Greetings,

We're trying to build and run some compression tests with Vitis Library's libzso demo folder.
Using a cloud server with a U50 card, we've managed to build xzlib.
However, when running xzlib, it fails to find the xclbin file like so:

user@as004:/scratch/Vitis_Libraries/data_compression/L3/demos/libzso$ ./build/xzlib -t data/testdata

ZLIB Compression - LIBZSO

ERT slotsize computed to: 1024
gzipOCLHost.cpp:1351 OPENCL API --> m_compressFullKernel = new cl::Kernel(*m_program, compress_kname.c_str(), &err), RESULT: --> CL_INVALID_KERNEL_NAME
Upon inspecting the source code, the failure appears to originate from the OCL_CHECK macro call on line 1351 of zlib-1.2.7/gzipOCLHost.cpp:

==
1347     if (m_compressFullKernel == NULL) {
1348         // Random: Use kernel name directly and let XRT
1349         // decide the CU connection based on HBM bank
1350         compress_kname = compress_kernel_names[1];
1351         OCL_CHECK(err, m_compressFullKernel = new cl::Kernel(*m_program, compress_kname.c_str(), &err));
1352         initialize_checksum = true;
1353     }
==

We checked with Xilinx Japan and they gave us these two explanations:

  1. A mismatch between device-side config (stream processing) and expected host code
  2. An error within xclbin generated by device config (block processing)

They think the latest master branch might be corrupted, but couldn't help further as it's beyond their expertise.

More info below:

Build steps and execution

$ make all TARGET=hw DEVICE=xilinx_u50_gen3x16_xdma_201920_3
$ ./build/xzlib -c data/testdata
/opt/xilinx/apps/zlib/xclbin/u50_gen3x16_xdma_201920_3.xclbin
Unable to open binary file

xclbin's location

$ find . -name *.xclbin | xargs ls -l
-rw-r--r-- 1 user user2 54074163  2月24 15:22 
./libzso/build/xclbin_xilinx_u50_gen3x16_xdma_201920_3_hw/compress_decompress.xclbin

Install locations

$ grep -r c_installRootDir
L3/include/zlib.hpp:constexpr auto c_installRootDir = "/opt/xilinx/apps/";
L3/src/zlibFactory.cpp:auto fullXclbin = std::string(c_installRootDir) + c_hardXclbinPath;
L3/src/zlibFactory.cpp:this->m_u50_xclbin = std::string(c_installRootDir) + c_hardXclbinPath;
L3/src/zlibFactory.cpp:std::string hpath = std::string(c_installRootDir) + "zlib/scripts/xrt.ini";
L3/src/zlib.cpp:std::ifstream bin_file((std::string(c_installRootDir) + c_hardFullXclbinPath), std::ifstream::binary);
common/libs/compress/gzipOCLHost.hpp:constexpr auto c_installRootDir = "/opt/xilinx/apps/";
common/libs/compress/gzipOCLHost.cpp:std::ifstream bin_file((std::string(c_installRootDir) + c_hardFullXclbinPath), std::ifstream::binary);
@vt-lib-support
Collaborator

vt-lib-support commented Apr 6, 2022

@DontSpillTheTea In order to run this test case successfully, you need to set up your environment with the following steps:

  1. setenv LD_LIBRARY_PATH ${PWD}:$LD_LIBRARY_PATH
  2. setenv XILINX_LIBZ_XCLBIN [ absolute path to your XCLBIN file ]
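The setenv commands above use csh/tcsh syntax. For bash/zsh users, a rough equivalent would be (the xclbin path below is a placeholder — substitute your actual file location):

```shell
# bash/zsh equivalents of the csh "setenv" steps above.
# The xclbin path is a placeholder -- point it at your real file.
export LD_LIBRARY_PATH="${PWD}:${LD_LIBRARY_PATH}"
export XILINX_LIBZ_XCLBIN="/absolute/path/to/compress_decompress.xclbin"
echo "$XILINX_LIBZ_XCLBIN"
```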

@DontSpillTheTea
Author

Thank you for the follow-up @vt-lib-support, but we are unfortunately meeting the same issue.

We followed the steps as seen below:

Assigning environment variables

user@as004:/scratch/Vitis_Libraries/data_compression/L3/demos/libzso ⇐ Working directory
$ export LD_LIBRARY_PATH=${PWD}:$LD_LIBRARY_PATH
$ echo $LD_LIBRARY_PATH
/scratch/Vitis_Libraries/data_compression/L3/demos/libzso:
$ export XILINX_LIBZ_XCLBIN=/scratch/Vitis_Libraries/data_compression/L3/demos/libzso/build/xclbin_xilinx_u50_gen3x16_xdma_201920_3_hw/compress_decompress.xclbin
$ echo $XILINX_LIBZ_XCLBIN
/scratch/Vitis_Libraries/data_compression/L3/demos/libzso/build/xclbin_xilinx_u50_gen3x16_xdma_201920_3_hw/compress_decompress.xclbin
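Before launching xzlib, one way to sanity-check the override is to verify it resolves to a readable, non-empty file. A minimal sketch (`check_xclbin` is a hypothetical helper written for this post, not part of the library):

```shell
# Hypothetical sanity check: confirm XILINX_LIBZ_XCLBIN points at a
# readable, non-empty file before running ./build/xzlib.
check_xclbin() {
    if [ -n "$XILINX_LIBZ_XCLBIN" ] && [ -s "$XILINX_LIBZ_XCLBIN" ]; then
        echo "OK: $XILINX_LIBZ_XCLBIN"
        return 0
    else
        echo "MISSING: XILINX_LIBZ_XCLBIN unset or not a readable file" >&2
        return 1
    fi
}

check_xclbin || echo "fix the environment before running ./build/xzlib"
```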

Running xzlib

user@as004:/scratch/Vitis_Libraries/data_compression/L3/demos/libzso ⇐ Working directory
$ ./build/xzlib -t data/testdata
ZLIB Compression - LIBZSO

gzipOCLHost.cpp:1351 OPENCL API --> m_compressFullKernel = new cl::Kernel(*m_program, compress_kname.c_str(), &err), RESULT: -->  CL_INVALID_KERNEL_NAME
$ ./build/xzlib -c data/testdata
gzipOCLHost.cpp:1351 OPENCL API --> m_compressFullKernel = new cl::Kernel(*m_program, compress_kname.c_str(), &err), RESULT: -->  CL_INVALID_KERNEL_NAME
$ ./build/xzlib -h
==================================================================
Usage: ./build/xzlib [Options] [Files]

          --help,                  -h        Print Help Options
          --compress,              -c        Compress the specified files
          --decompress,            -d        Decompress the specified files
          --test,                  -t        Compress followed by Decompress to test that decompression produces original files
          --verbosity,             -v        Verbose Mode [0|1]
                Default: [0]
          --no_acceleration,       -n        Do not use Xilinx Alveo, use CPU cores instead. By default Alveo acceleration is used [FPGA=1|CPU=0]
                Default: [1]
          --compression_level,     -cl       Compression Level Settings [No Compression=0|Best Speed=1|Best Compression=9|Default=-1]
                Default: [-1]
          --chunk_mode,            -cm       Chunk Mode: Input file divided and deflate/inflate called in a loop [0|1]
                Default: [0]
          --c_file_list,           -cfl      Compression List of Files
          --d_file_list,           -dfl      Decompression List of Files
          --file_list,             -l        File List (Compress, Decompress and Validation)
          --num_iter,              -nitr     Number of Iterations
                Default: [1]
          --max_cr,                -mcr      Maximum CR
                Default: [20]
          --mprocess,              -mp       Multi Process [1] or Multi thread [0]
                Default: [1]

If we missed anything please let us know.

@vt-lib-support
Collaborator

@DontSpillTheTea please confirm that you are providing an absolute path. Also, can you try the sw_emu flow of this demo in your environment and confirm that all the settings work as expected?

@DontSpillTheTea
Author

Thank you for the reply.

After locating the .xclbin, we set XILINX_LIBZ_XCLBIN to the absolute path of the only relevant file we could find.

$ find . -name *.xclbin | xargs ls -l
-rw-r--r-- 1 user user2 54074163  2月24 15:22 
./libzso/build/xclbin_xilinx_u50_gen3x16_xdma_201920_3_hw/compress_decompress.xclbin
$ export XILINX_LIBZ_XCLBIN=/scratch/Vitis_Libraries/data_compression/L3/demos/libzso/build/xclbin_xilinx_u50_gen3x16_xdma_201920_3_hw/compress_decompress.xclbin

Are we mistakenly using a wrong path?

> Also, can you try the sw_emu flow of this demo in your environment and confirm that all the settings work as expected?

I'm sorry, could you please elaborate on what you mean by the sw_emu flow? Is it related to the source code in src/host.cpp?

@vt-lib-support
Collaborator

vt-lib-support commented Apr 11, 2022

@DontSpillTheTea We could reproduce the problem. The bug is in an overloaded method that contains an invalid (outdated) kernel name in the libz flow, whereas the normal buffer-only flow is correct and works fine. We will provide a fix soon. Meanwhile, you can explore the emulation flows, which use the buffer-only flow, for functionality and compression-ratio evaluation.

Here sw_emu means software emulation and hw_emu means hardware emulation.
make run TARGET=sw_emu DEVICE= < absolute path to xpfm file >

The command above runs the libz application in the Vitis emulation flow for functional verification.
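For readers unfamiliar with the flow, a typical sw_emu session looks roughly like the fragment below. All paths are placeholders that vary by Vitis version and installation; this is an outline, not a verified recipe for this repo:

```shell
# Command fragment only -- requires a Vitis/XRT install; paths are placeholders.
# 1. Load the Vitis and XRT environments (version and paths vary by install):
source /opt/xilinx/Vitis/2021.2/settings64.sh
source /opt/xilinx/xrt/setup.sh
# 2. Build and run the demo in software emulation against a platform (.xpfm) file:
cd Vitis_Libraries/data_compression/L3/demos/libzso
make run TARGET=sw_emu DEVICE=/path/to/xilinx_u50_gen3x16_xdma_201920_3.xpfm
```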

vt-lib-support pushed a commit that referenced this issue Apr 27, 2022
vt-lib-support pushed a commit that referenced this issue Apr 27, 2022
vt-lib-support pushed a commit that referenced this issue May 18, 2023