
ggml : remove ggml_flash_attn and ggml_flash_ff #7463

Merged
merged 1 commit into master from gg/ggml-remove-old-flash on May 23, 2024

Conversation

ggerganov (Owner) commented May 22, 2024

These two ops were used in whisper.cpp in the past, but with the new FA implementation they are no longer necessary. I doubt that any other project is using them, so there is no need to go through a deprecation phase.

Regarding the training examples: for now we stop using FA. The longer-term plan is to start using the new ggml_flash_attn_ext() and adapt ggml_flash_attn_back() to it, but this is low priority for now.
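For context, here is a minimal sketch (editorial illustration, not code from this PR) of how a caller of the removed ggml_flash_attn() maps onto the fused ggml_flash_attn_ext() path. The helper name attn_fused, the tensor shapes in the comments, and the exact argument list of ggml_flash_attn_ext() at this revision are assumptions; consult ggml.h for the authoritative signature.

    #include <math.h>
    #include "ggml.h"

    // Hypothetical helper for illustration only.
    static struct ggml_tensor * attn_fused(
            struct ggml_context * ctx,
            struct ggml_tensor  * q,        // assumed [d_head, n_tokens, n_head]
            struct ggml_tensor  * k,        // assumed [d_head, n_kv,     n_head]
            struct ggml_tensor  * v,        // assumed [d_head, n_kv,     n_head]
            struct ggml_tensor  * kq_mask,  // attention mask (may be NULL)
            int                   d_head) {
        // previously (op removed in this PR):
        //     return ggml_flash_attn(ctx, q, k, v, /*masked=*/true);
        // fused replacement; scale and max_bias arguments assumed from ggml.h:
        return ggml_flash_attn_ext(ctx, q, k, v, kq_mask,
                                   1.0f / sqrtf((float) d_head), // KQ scale
                                   0.0f);                        // max_bias (ALiBi disabled)
    }

Under this plan, ggml_flash_attn_back(), which the training examples relied on, would need an analogous adaptation to the ext op before FA could be re-enabled there.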

@github-actions bot added the testing (Everything test related) and examples labels on May 22, 2024
📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 554 iterations 🚀

  • Concurrent users: 8, duration: 10m
  • HTTP request: avg=8412.43ms p(95)=20245.58ms fails=, finish reason: stop=502 truncated=52
  • Prompt processing (pp): avg=95.76tk/s p(95)=397.1tk/s
  • Token generation (tg): avg=32.87tk/s p(95)=48.87tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=gg/ggml-remove-old-flash commit=dfdeb8aaf51f1e63e8d122acebb3925fa3a60af6

[Chart: llamacpp:prompt_tokens_seconds, bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 554 iterations]
[Chart: llamacpp:predicted_tokens_seconds, same run]


[Chart: llamacpp:kv_cache_usage_ratio, same run]
[Chart: llamacpp:requests_processing, same run]

@mofosyne added the Review Complexity: Medium label (generally requires more time to grok, but manageable by beginner to medium expertise level) on May 22, 2024
@ggerganov ggerganov merged commit d48c88c into master May 23, 2024
73 of 84 checks passed
@ggerganov ggerganov deleted the gg/ggml-remove-old-flash branch May 23, 2024 07:00
teleprint-me pushed a commit to teleprint-me/llama.cpp that referenced this pull request May 23, 2024