Aurelien Rodriguez
Llama 2: Open Foundation and Fine-Tuned Chat Models
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama 2-Chat, are optimized for dial…
LLaMA: Open and Efficient Foundation Language Models
We introduce LLaMA, a collection of foundation language models ranging from 7B to 65B parameters. We train our models on trillions of tokens, and show that it is possible to train state-of-the-art models using publicly available datasets e…
Proceedings of the Sixth Workshop on Financial Technology and Natural Language Processing
Natural language processing (NLP) has recently gained relevance within financial institutions by providing highly valuable insights into companies' and markets' financial documents. However, the landscape of the financial domain presents ext…
Weak Arithmetic Cobordism
In the early 2000s, Levine and Morel gave a geometric construction of an algebraic cobordism group defined for all smooth quasi-projective varieties over a field. We show how we can refine their construction to build an Arakelov vers…