Helpful links below:
Kaggle
------------
https://www.kaggle.com/murats/word2vec-application-on-turkish-newspaper/execution
https://www.kaggle.com/guichristmann/lstm-classification-model-with-word2vec#Loading,-cleaning-and-formatting-data
Medium
------------
https://medium.com/@azercelikten/rnn-recurrent-neural-network-nedir-ve-nas%C4%B1l-%C3%A7al%C4%B1%C5%9F%C4%B1r-d1246b1a61fb
https://medium.com/deep-learning-turkiye/rnn-nedir-nas%C4%B1l-%C3%A7al%C4%B1%C5%9F%C4%B1r-9e5d572689e1
YouTube
-----------
https://www.youtube.com/watch?v=LfnrRPFhkuY&ab_channel=codebasics
Blogs
----------
https://explosion.ai/blog/deep-learning-formula-nlp
https://stackoverflow.com/questions/42064690/using-pre-trained-word2vec-with-lstm-for-word-generation
https://blog.floydhub.com/long-short-term-memory-from-zero-to-hero-with-pytorch/
https://towardsdatascience.com/text-summarization-from-scratch-using-encoder-decoder-network-with-attention-in-keras-5fa80d12710e
https://medium.com/@tuncerergin/keras-ile-derin-ogrenme-modeli-olusturma-4b4ffdc35323
https://medium.com/analytics-vidhya/seq2seq-abstractive-summarization-using-lstm-and-attention-mechanism-code-da2e9c439711
https://github.com/aravindpai/How-to-build-own-text-summarizer-using-deep-learning/blob/master/How_to_build_own_text_summarizer_using_deep_learning.ipynb
Git
---------
https://github.com/singularity014/Keyword-Extraction-Bidirectional-LSTM
https://github.com/basaldella/deepkeyphraseextraction
https://gist.github.com/maxim5/c35ef2238ae708ccb0e55624e9e0252b
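Example sketch
-----------
Several of the links above (the Kaggle LSTM classification notebook, the StackOverflow question and the maxim5 gist) follow the same recipe: load pre-trained word2vec vectors with gensim, copy them into a frozen Keras Embedding layer and stack an LSTM on top. Below is a minimal sketch of that recipe, not code from any of the links; it assumes gensim 4.x, TensorFlow 2.x with its bundled Keras, "word2vec.bin" as a placeholder path to a binary word2vec file, and a toy vocabulary for illustration.

import numpy as np
from gensim.models import KeyedVectors
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
from tensorflow.keras.initializers import Constant

# Load pre-trained word2vec vectors (placeholder path).
kv = KeyedVectors.load_word2vec_format("word2vec.bin", binary=True)
embedding_dim = kv.vector_size

# Toy vocabulary: word -> integer index, index 0 reserved for padding.
vocab = {"the": 1, "cat": 2, "sat": 3}

# Copy known vectors into the embedding matrix; out-of-vocabulary rows stay zero.
embedding_matrix = np.zeros((len(vocab) + 1, embedding_dim))
for word, idx in vocab.items():
    if word in kv:
        embedding_matrix[idx] = kv[word]

# LSTM classifier on top of the frozen pre-trained embeddings.
model = Sequential([
    Embedding(input_dim=len(vocab) + 1,
              output_dim=embedding_dim,
              embeddings_initializer=Constant(embedding_matrix),
              trainable=False),
    LSTM(128),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy batch of two padded sequences, just to confirm the shapes line up.
dummy = np.array([[1, 2, 3, 0], [2, 3, 0, 0]])
print(model.predict(dummy).shape)  # (2, 1)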
Articles
------------