Google API limit? #370
Comments
Google is actually far more generous than what their public API limit states. I'll give you a workaround hint: 6 versus 4. |
Hi guys,
Is there any possibility to optionally give a Google API key in Translate Shell, like in a configuration file or via a command-line option?
Or is there any possibility to add parameters to the GET request which is created by Translate Shell?
Regards
Ben |
[ERROR] Google did not return results because rate limiting is in effect
[ERROR] Rate limiting
It started happening today and I barely used it at all. So I guess it's the dev's API key which is rate limited? |
Dev API limit has always been there. They have a more lax limit when not using OAuth tokens.
I’ve made a custom bash script and curl query that essentially mimics what Translate Shell does, but only for Google, getting the full JSON API dump back, and benchmarked it to a hard limit of roughly 4500 bytes per 6.5–7.0 seconds (Unicode counted in bytes, not characters).
Take the UTF-8 codepoint U+941A, for example.
URL-encoded it becomes %E9%90%9A.
That counts as 3 bytes toward the limit, not 1 and not 9.
Remember that each newline \n is one byte too.
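To make the counting concrete, here's a tiny shell illustration (my own sketch, not part of the script being described; it assumes a UTF-8 locale):

```sh
# U+941A (鐚) is 1 character but 3 bytes in UTF-8.
s='鐚'
printf '%s' "$s" | wc -m        # 1 character
printf '%s' "$s" | wc -c        # 3 bytes -- what the limit counts
printf '%s\n' "$s" | wc -c      # 4 bytes -- each newline adds one
# URL-encoded it travels as %E9%90%9A (9 characters on the wire),
# but it still costs only 3 bytes against the ~4500-byte budget.
```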
And finally, remember that you, on IPv4 and on IPv6, aren’t the same person ^^
|
@mogando668 So can you share the script? |
Were you able to pass an API key? |
Lemme collate that code for you,
but I don’t have any OAuth2 API key to piggyback upon. The script doesn’t need any of that, and hardly any strange custom HTTP headers. No VPN, no obscuring via proxy or smart DNS, no back door, no side door. Just straight-up curl to their public API, the same URL accessed by Translate Shell.
The main “trick”, which isn’t really one, lies in the fact that my home ISP gives me an IPv4 and an IPv6 address simultaneously, but Google’s servers track querying frequency and volume as if those were 2 completely independent systems, effectively doubling the rate without any special tricks (see the sketch below).
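A minimal sketch of that idea, assuming the widely known public "gtx" endpoint that Translate Shell queries (the exact URL parameters here are my reconstruction, not the actual script):

```sh
URL='https://translate.googleapis.com/translate_a/single?client=gtx&sl=auto&tl=es&dt=t'

# Pin curl to one address family per call; Google's rate accounting
# treats the IPv4 and the IPv6 address as two independent clients.
curl -4 -sG "$URL" --data-urlencode 'q=first batch of text'
curl -6 -sG "$URL" --data-urlencode 'q=second batch of text'
```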
I’ll detail them in my email later. You’re free to do whatever you want with that code. It’s not under any sort of license, nor would I ever care about that. I simply believe in sharing.
Regards,
Jason K
|
I'm wondering this, too. |
So sorry for the ultra-long delay. My code pieces are still a mess; hope you can make some sense out of them. Reply here if it's confusing.

tsS squeezes spaces, tsL trims the ends of lines, and uniqPB is an alias that uses mawk to keep only unique non-empty lines without sorting.

The way it's written now, it looks for the most recent flat text file in the current folder named smallerM3T_*.txt, because I use this one-liner utility via GNU Parallel to control the jobs. The small-T files have 3 columns, using the equals sign "=" (\075 / \x3D) as the delimiter. Column 1 holds fixed-width alphanumeric indices I made; column 2 isn't needed here, it's just a 1-letter language code for E / C / J / K / O(thers); column 3 is the text that needs to be translated.

What's already residing inside the need*.txt files is literally rows and rows of alphanumeric indices, using pipe as the delimiter. Each time Parallel runs this shell function, it greps those indices against column 1, takes what's intended to be translated, and URL-quote-plus encodes it before sending it over. I try to limit each query to 4500 bytes, counted before URL encoding.

Right now the thing is overly batch-optimized, and you can see a lot of overhead if you just want to translate a few lines. I needed to translate 12.5 million lines, each into 3 languages, for over 37,000,000 translations in total, so batching was the only viable way. You can also see the code has nowhere to put an API key, because simply none was needed.
And it all gets dumped into one shared text output file.
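Here's a rough sketch of the shape of it, under the layout described above (not the actual script: the grep against need*.txt and the GNU Parallel wrapper are omitted, and out.json is a placeholder output name):

```sh
#!/usr/bin/env bash
# Sketch only. Rows of smallerM3T_*.txt look like:  INDEX=LANG=TEXT

URL='https://translate.googleapis.com/translate_a/single?client=gtx&sl=auto&tl=es&dt=t'

src=$(ls -t smallerM3T_*.txt | head -n 1)   # newest small-T file

batch='' bytes=0
while IFS='=' read -r idx lang text; do     # idx/lang unused in this sketch
    line="$text"$'\n'
    n=$(printf '%s' "$line" | wc -c)        # UTF-8 bytes, newline included
    if (( bytes + n > 4500 )); then         # keep each query under ~4500 B
        curl -sG "$URL" --data-urlencode "q=$batch" >> out.json
        sleep 7                             # ~4500 bytes per 6.5-7.0 s
        batch='' bytes=0
    fi
    batch+="$line"; bytes=$(( bytes + n ))
done < "$src"

# flush the last partial batch
[ -n "$batch" ] && curl -sG "$URL" --data-urlencode "q=$batch" >> out.json
```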
Enjoy! |
Is it possible to pass our own API key?
|
Hi,
I'm surprised that Google won't let me translate more than 6 files... is this normal? Does it happen to anyone else?
It seems reasonable that Google limits the API; can someone guide me through it, please?
Of course, it is not a problem with Scan Tailor, but I can't find anywhere to study the subject. The best thing I can do is make use of my VPN (thanks NordVPN!).
This is the command I am using:
trans :es file://IMG_20200911_215210.txt -o IMG_20200911_215210_ES.txt
Best,
Martin
Translate Shell 0.9.6.12
platform Darwin
terminal type xterm-256color
bi-di emulator [N/A]
gawk (GNU Awk) 5.1.0
fribidi (GNU FriBidi) 1.0.10
audio player [NOT INSTALLED]
terminal pager less
web browser open
user locale en_US.UTF-8 (English)
home language en
source language auto
target language en
translation engine google
proxy [NONE]
user-agent Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36
ip version [DEFAULT]
theme default
init file [NONE]