This past summer, while back at home due to COVID-19, I had the fortunate opportunity to work as an applied research intern at Georgian. As the title suggests, my project explored how we can leverage the strong performance of transformers while incorporating tabular data alongside text data. It was a terrific learning experience: I got to explore the latest research in transformer models and understand how our research fits into the company's broader business goals. I also had a lot of freedom, designing and developing my own Python package that adapts the popular HuggingFace Transformers library for tabular data.
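To give a sense of the core idea, here is a minimal sketch of one common way to combine text and tabular inputs: take the transformer's pooled text embedding (e.g. the [CLS] vector), concatenate it with the numerical tabular features, and feed the result to a classification head. The shapes, feature counts, and random weights below are illustrative stand-ins, not the actual implementation in the package.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for real model outputs and features (hypothetical sizes):
# - text_emb: pooled [CLS] embeddings from a transformer (batch of 4, dim 768)
# - tabular:  per-example numerical/categorical features (batch of 4, dim 10)
text_emb = rng.normal(size=(4, 768))
tabular = rng.normal(size=(4, 10))

# Concatenate the two modalities along the feature axis -> (4, 778)
combined = np.concatenate([text_emb, tabular], axis=1)

# A simple linear classification head over the combined features
# (random placeholder weights; in practice these are learned)
W = rng.normal(size=(778, 2)) * 0.01
logits = combined @ W  # shape (4, 2): one score per class per example
```

Concatenation is only the simplest combining strategy; gating or attention-based fusion between the modalities is also possible, which is part of what makes this design space interesting.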

To learn more about this project and to try out a demo, feel free to check out the Medium article I wrote for Georgian's research blog.
