AI more dangerous than nuclear warfare?
Are ChatGPT Plugins the Next App Store?
Sparks of Artificial General Intelligence: Early Experiments with GPT-4
Source: Microsoft Research Abstract: The latest model developed by OpenAI, GPT-4, was trained using an unprecedented scale of compute and data. In this paper, we report on our investigation of an early version of GPT-4, when it was still in active development by OpenAI. We contend that (this early version of) GPT-4 is part of a […]
GPT could affect 80% of the US workforce – can help those who lack skills…
Goldman Sachs: generative AI could expose the equivalent of 300mn full-time jobs to automation.
AI to impact 80% of jobs
Google Releases Bard – Can it take on GPT-4?
FlexiViT: One Model for All Patch Sizes
Source: Google Vision Transformers convert images to sequences by slicing them into patches. The size of these patches controls a speed/accuracy tradeoff, with smaller patches leading to higher accuracy at greater computational cost, but changing the patch size typically requires retraining the model. In this paper, we demonstrate that simply randomizing the patch size at […]
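The randomized-patch-size idea in the abstract can be illustrated with a short sketch. This is not Google's FlexiViT implementation; it only shows the mechanics the abstract describes: slicing an image into non-overlapping patches, with the patch size drawn at random each step so that sequence length (and thus compute) varies while the image stays fixed. The `patchify` helper and the candidate patch sizes are illustrative assumptions.

```python
import numpy as np

def patchify(image, patch_size):
    """Split an image of shape (H, W, C) into non-overlapping square patches.

    Returns an array of shape (num_patches, patch_size * patch_size * C),
    i.e. the token sequence a Vision Transformer would embed.
    """
    h, w, c = image.shape
    assert h % patch_size == 0 and w % patch_size == 0, "size must divide image"
    patches = image.reshape(h // patch_size, patch_size,
                            w // patch_size, patch_size, c)
    patches = patches.transpose(0, 2, 1, 3, 4)  # group the two grid axes first
    return patches.reshape(-1, patch_size * patch_size * c)

# Randomize the patch size per training step, as the abstract suggests:
rng = np.random.default_rng(0)
image = rng.standard_normal((240, 240, 3))
for _ in range(3):
    p = int(rng.choice([8, 12, 16, 24, 30, 40, 48]))  # divisors of 240
    tokens = patchify(image, p)
    # smaller patches -> longer sequence -> more compute, higher accuracy
    assert tokens.shape == ((240 // p) ** 2, p * p * 3)
```

The speed/accuracy trade-off the abstract mentions falls directly out of the shapes: halving the patch size quadruples the token count.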
NVIDIA Increases GPU Speeds by 40x
Reflexion: an autonomous agent with dynamic memory and self-reflection
Abstract: Recent advancements in decision-making large language model (LLM) agents have demonstrated impressive performance across various benchmarks. However, these state-of-the-art approaches typically necessitate internal model fine-tuning, external model fine-tuning, or policy optimization over a defined state space. Implementing these methods can prove challenging due to the scarcity of high-quality training data or the lack of […]
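The trial-and-reflect loop the title refers to can be sketched in a few lines. This is a toy outline, not the paper's method: `attempt_fn` stands in for the LLM agent, `evaluate_fn` for the environment's success signal, and the reflection stored in memory is simplified to the raw feedback string rather than a generated self-critique.

```python
def reflexion_loop(task, attempt_fn, evaluate_fn, max_trials=3):
    """Minimal sketch of a Reflexion-style agent loop.

    attempt_fn(task, memory) -> candidate answer, conditioned on past reflections
    evaluate_fn(answer)      -> (success: bool, feedback: str)
    Failed trials append a reflection to a dynamic memory that is fed
    back into the next attempt, so no model fine-tuning is required.
    """
    memory = []  # episodic memory of verbal self-reflections
    answer = None
    for trial in range(max_trials):
        answer = attempt_fn(task, memory)
        success, feedback = evaluate_fn(answer)
        if success:
            return answer
        # A real agent would ask the LLM to reflect; here we store feedback as-is.
        memory.append(f"Trial {trial} failed: {feedback}")
    return answer
```

A toy run shows the mechanism: an agent that answers correctly only once its memory is non-empty succeeds on the second trial.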
Baidu releases Ernie bot to compete with ChatGPT
Can ChatGPT Improve Investment Decision? From a portfolio management perspective
ChatGPT as a large language model (LLM) has recently gained significant attention for generating human-like text. While much of the existing research has focused on its applications for writing and language translation, the potential of ChatGPT in finance, particularly in the context of investment, remains unexplored. This research aims to examine the efficacy of ChatGPT […]
PaLM-E: An Embodied Multimodal Language Model
Source: Google Research Abstract: Large language models have been demonstrated to perform complex tasks. However, enabling general inference in the real world, e.g. for robotics problems, raises the challenge of grounding. We propose embodied language models to directly incorporate real-world continuous sensor modalities into language models and thereby establish the link between words and percepts. […]
LLaMA: Open and Efficient Foundation Language Models
Source: Meta Abstract: We introduce LLaMA, a collection of foundation language models ranging from 7B to 65B parameters. We train our models on trillions of tokens, and show that it is possible to train state-of-the-art models using publicly available datasets exclusively, without resorting to proprietary and inaccessible datasets. In particular, LLaMA-13B outperforms GPT-3 (175B) on […]
LaMDA: Language Models for Dialog Applications
We present LaMDA: Language Models for Dialog Applications. LaMDA is a family of Transformer-based neural language models specialized for dialog, which have up to 137B parameters and are pre-trained on 1.56T words of public dialog data and web text. While model scaling alone can improve quality, it shows less improvement on safety and factual grounding. […]
Disney and NBC Eyeing New York’s AI Tax Break Ban Proposal
SingSong: Generating musical accompaniments from singing
Source: Google We present SingSong, a system that generates instrumental music to accompany input vocals, potentially offering musicians and non-musicians alike an intuitive new way to create music featuring their own voice. To accomplish this, we build on recent developments in musical source separation and audio generation. Specifically, we apply a state-of-the-art source separation algorithm […]