The multifaceted challenge of powering AI
Providing electricity to power-hungry data centers is stressing grids, raising prices for consumers, and slowing the transition to clean energy.
The rapid development and deployment of powerful generative AI models come with environmental consequences, including increased electricity demand and water consumption.
Assistant Professor Manish Raghavan wants computational techniques to help solve societal problems.
Biodiversity researchers tested vision systems on how well they could retrieve relevant nature images. More advanced models performed well on simple queries but struggled with more research-specific prompts.
A new technique identifies and removes the training examples that contribute most to a machine-learning model’s failures.
Research from the MIT Center for Constructive Communication finds this effect occurs even when reward models are trained on factual data.
Using LLMs to convert machine-learning explanations into readable narratives could help users make better decisions about when to trust a model.
Researchers develop “ContextCite,” a method for tracking an AI model’s source attribution and detecting potential misinformation.
MIT engineers developed the largest open-source dataset of car designs, including their aerodynamics, that could speed design of eco-friendly cars and electric vehicles.
This new device uses light to perform the key operations of a deep neural network on a chip, opening the door to high-speed processors that can learn in real time.