Tabnine: Not Your Mother’s Autocomplete #GPT2 #Transformer #Machinelearning @TabNineInc

Tabnine uses OpenAI's GPT-2 transformer model to create an AI autocomplete tool that ‘helps you write code faster’. The Deep TabNine beta was announced last week on Twitter and is now available for download; installation and project details are in the Deep TabNine blog post.

Deep TabNine is based on GPT-2, which uses the Transformer network architecture. The architecture was originally developed to solve problems in natural language processing. Although modeling code and modeling natural language might appear to be unrelated tasks, modeling code requires understanding English in some unexpected ways. For example, the model can be made to negate English words by asking it to complete the else branch of an if/else statement.
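As a rough illustration of the kind of completion that example describes (the snippet below is not from the TabNine blog, just a hypothetical sketch), suppose a developer has typed everything up to the print call in the else branch. A useful suggestion requires the model to produce the English negation of the word used in the if branch:

    def describe(n):
        if n % 2 == 0:
            print("n is even")
        else:
            # After typing print("n is , a good completion is odd") --
            # the model must infer that "odd" negates "even".
            print("n is odd")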

If you would like to learn more about the GPT-2 transformer model, take a look at the publication or try implementing the code yourself!
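As a minimal sketch of experimenting with GPT-2 (assuming the Hugging Face transformers library and its pretrained "gpt2" weights, neither of which is mentioned in the original post), you can load the model and ask it to continue a code-like prompt:

    # Assumption: pip install transformers torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Feed the language model a partial if/else and greedily decode a short continuation.
    prompt = "if is_valid:\n    print('valid')\nelse:\n    print("
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=8, do_sample=False,
                             pad_token_id=tokenizer.eos_token_id)
    print(tokenizer.decode(outputs[0]))

Note that this general-purpose GPT-2 checkpoint was trained on web text rather than tuned for code completion, so its suggestions will be far weaker than a purpose-built tool like Deep TabNine; the sketch is only meant to show the prompt-and-continue interaction the model family uses.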
