Bonnes pratiques en documentation
Latest news on best practices in documentary research, documentation analysis, search engines, and more. Curated by Stéphane Cottin
Scooped by Stéphane Cottin, March 26, 7:13 AM
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data.
The scientific paper "Attention Is All You Need" introduced the Transformer, a new neural-network architecture that makes language understanding easier. Before the Transformer, machines were not very good at understanding long sentences because they could not track the relationships between words far apart from one another. The Transformer changed that, becoming the cornerstone of today's most impressive language-understanding and generative-AI systems. Translation, text summarization, question answering, even image generation and robotics: the Transformer has revolutionized how machines perform all of these tasks.
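For readers unfamiliar with the mechanism the abstract builds on, here is a minimal NumPy sketch of scaled dot-product attention, softmax(QKᵀ/√d_k)·V, the building block the Transformer uses in place of recurrence and convolutions. The toy shapes and random inputs are illustrative only and are not taken from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the Transformer's core operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise query/key similarity
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                        # weighted sum of value vectors

# Toy example: a "sentence" of 3 tokens, each represented in 4 dimensions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one 4-dimensional output per token
```

Because every token attends to every other token in a single matrix product, distant words are related directly, which is exactly the property the summary above credits for the Transformer's success with long sentences.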
Scooped by Stéphane Cottin, March 17, 1:16 PM
By Antony Belin. Note to the reader: the legal provisions cited here refer to the French legal context. Several scenarios are possible when setting up an electronic archiving system (système d'archivage électronique, SAE): the legal authority (entity, body, local government…) that owns the deposited data may act as the archiving authority (AA), or a third-party archiving provider may act as the third-party archiving authority (ATA), all of which…
Scooped by Stéphane Cottin, March 14, 2:34 AM
This chapter examines the European Union's Artificial Intelligence Act (AI Act) through the framework of digital constitutionalism. It explores how the AI…
Scooped by Stéphane Cottin, March 13, 2:36 PM
The ontology engineering process is complex, time-consuming, and error-prone, even for experienced ontology engineers. In this work, we investigate the potential of Large Language Models (LLMs) to provide effective OWL ontology drafts directly from ontological requirements described using user stories and competency questions. Our main contribution is the presentation and evaluation of two new prompting techniques for automated ontology development: Memoryless CQbyCQ and Ontogenia. We also emphasize the importance of three structural criteria for ontology assessment, alongside expert qualitative evaluation, highlighting the need for a multi-dimensional evaluation in order to capture the quality and usability of the generated ontologies. Our experiments, conducted on a benchmark dataset of ten ontologies with 100 distinct CQs and 29 different user stories, compare the performance of three LLMs using the two prompting techniques. The results demonstrate improvements over the current state-of-the-art in LLM-supported ontology engineering. More specifically, the model OpenAI o1-preview with Ontogenia produces ontologies of sufficient quality to meet the requirements of ontology engineers, significantly outperforming novice ontology engineers in modelling ability. However, we still note some common mistakes and variability of result quality, which is important to take into account when using LLMs for ontology authoring support. We discuss these limitations and propose directions for future research.
Scooped by Stéphane Cottin, March 12, 4:25 PM
Economics Job Market Rumors (EJMR) is an online forum and clearinghouse for information on the academic job market for economists. It also includes content that…
Scooped by Stéphane Cottin, March 6, 11:23 AM
Cram, Lawrence, Docampo, Domingo, and Safón, Vicente, "Screening Articles by Citation Reputation" (July 22, 2024). Quantitative Science Studies, doi:10.1162/qss_a_00355. Available at SSRN: https://ssrn.com/abstract=5129969
Scooped by Stéphane Cottin, February 16, 2:21 PM
Since the beginning of the century, new models for disseminating scientific research have emerged. Commodifying the open-access mandate for their own profit, they trap researchers who are poorly informed or looking for an easier way to meet their institutions' demands. To avoid the degradation of scientific quality caused by these mega-journals, it is urgent to rethink how research is reviewed and evaluated.
Scooped by Stéphane Cottin, February 9, 1:23 PM
By Florence Maraninchi. The year 2025 has already proved particularly fertile in news items, each more earth-shattering than the last, about funding, the arms race between China and the USA, the intergalactic AI summit in Paris, and the … Continue reading →
Scooped by Stéphane Cottin, February 6, 2:26 AM
Parsons, Patrick, Niedringhaus, Kristina L., and Zhang, Alex, "Artificial Intelligence & the Future of Law Libraries" (December 1, 2024). Georgia State University College of Law, Legal Studies Research Paper, forthcoming. Available at SSRN: https://ssrn.com/abstract=5118446 or http://dx.doi.org/10.2139/ssrn.5118446
Scooped by Stéphane Cottin, February 4, 3:57 AM
The European Union's regulation on artificial intelligence (AI) of 13 June 2024 was published in the Official Journal of the European Union on 12 July. On 17 May 2024, the Council of Europe adopted an international treaty intended to guarantee AI that respects fundamental rights. Vie-publique looks back at AI in the EU in seven questions.
Scooped by Stéphane Cottin, January 29, 2:21 AM
The creative capacities of generative artificial intelligence (AI) systems can be attributed to an extensive training of the underlying models. This training ut…
Scooped by Stéphane Cottin, January 29, 2:19 AM
Purpose: This review evaluates the Citationchaser tool as a research support application for systematic reviews, specifically focusing on its functionali…