BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Europe/Stockholm
X-LIC-LOCATION:Europe/Stockholm
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230831T095745Z
LOCATION:Sanada I
DTSTART;TZID=Europe/Stockholm:20230626T173000
DTEND;TZID=Europe/Stockholm:20230626T180000
UID:submissions.pasc-conference.org_PASC23_sess120_msa292@linklings.com
SUMMARY:Using In-Context Learning and Frozen Large Language Models for Bay
 esian Optimization of Catalysts
DESCRIPTION:Minisymposium\n\nMayk Caldas Ramos, Shane Michtavy, Marc Poro
 soff, and Andrew White (University of Rochester)\n\nLarge Language Model
 s (LLMs) are advanced artificial intelligence (AI) systems capable of un
 derstanding and generating human-like text. In this study, we demonstrat
 e how in-context learning with frozen LLMs can predict chemical properti
 es directly from experimental procedures. We developed a prompting syste
 m that enables LLMs to perform regression with uncertainties, which is e
 ssential for techniques like Bayesian optimization. By selecting which e
 xamples compose the context, we enhance the model's performance beyond i
 ts context window, the maximum number of tokens it can process simultane
 ously. Although our model doesn't outperform all baselines, it performs 
 satisfactorily without training, feature selection, or significant compu
 ting resources. Our work highlights the potential of LLMs for efficient 
 material and molecular design using natural language predictions.\n\nDom
 ain: Life Sciences\n\nSession Chair: Arvind Ramanathan (Argonne National
  Laboratory, University of Chicago)
END:VEVENT
END:VCALENDAR
