Apple made an AI image tool that lets you make edits by describing them

Key Points:

  • Apple researchers released a new model for instruction-based image editing
  • The MGIE model allows users to describe image changes through text prompts
  • MGIE uses multimodal large language models to interpret user prompts and perform image edits

Summary:

Apple researchers have unveiled a model that lets users edit photos by describing the desired changes in plain text, without working directly in editing software. MGIE (MLLM-Guided Image Editing), developed in collaboration with the University of California, Santa Barbara, handles editing tasks such as cropping, resizing, flipping, and applying filters from natural language instructions. It leverages multimodal large language models to interpret a user's prompt, derive an explicit, visually grounded editing intention, and then render the corresponding edit, keeping the experience seamless for the user. For instance, typing a command like “make it more healthy” on an image of a pepperoni pizza prompts the model to add vegetable toppings. The approach marks a notable step toward user-friendly image manipulation, offering precise editing outcomes grounded in those explicit, visually aware intentions, and the researchers behind MGIE emphasize the model's efficiency and its potential impact on future vision-and-language research. Apple has made MGIE available for download via GitHub, along with a web demo on Hugging Face Spaces, but its long-term plans for the model remain undisclosed.
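For readers who want a feel for what instruction-based editing looks like in code, the minimal sketch below uses the publicly documented InstructPix2Pix pipeline from Hugging Face's diffusers library, not Apple's MGIE code (whose API is not described here); the checkpoint name, file paths, and parameter values are illustrative assumptions rather than anything from the MGIE release.

```python
# Illustrative sketch only: uses the InstructPix2Pix pipeline from the
# diffusers library, NOT Apple's MGIE code, to show how an edit can be
# expressed as a plain-text instruction.
import torch
from PIL import Image
from diffusers import StableDiffusionInstructPix2PixPipeline

# Load a pretrained instruction-following editing model (example checkpoint).
pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
    "timbrooks/instruct-pix2pix", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # assumes a CUDA-capable GPU is available

# "pizza.jpg" is a placeholder input image of a pepperoni pizza.
image = Image.open("pizza.jpg").convert("RGB")

# The edit is described in natural language, as in the article's example.
edited = pipe(
    "make it more healthy",
    image=image,
    num_inference_steps=20,
    image_guidance_scale=1.5,
).images[0]

edited.save("pizza_edited.jpg")
```

For MGIE itself, the GitHub repository and the Hugging Face Spaces demo mentioned above are the authoritative entry points.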


Compared with existing AI image editing tools such as OpenAI’s DALL-E 3 and Adobe’s Firefly AI, Apple’s foray into generative AI with MGIE signals a strategic move to strengthen its AI capabilities. While Adobe’s Firefly powers features like generative fill for adding backgrounds to images, MGIE puts Apple on a path to compete in the growing field of AI-driven photo editing tools. Although Apple has trailed tech giants like Microsoft, Meta, and Google in generative AI, CEO Tim Cook has signaled the company’s commitment to bringing AI features to its devices and redefining user experiences. Recent moves, such as the release of the open-source MLX machine learning framework for training AI models on Apple Silicon chips, reflect a concerted effort to establish Apple as a key player in the AI ecosystem. In that context, the unveiling of MGIE represents a meaningful step toward reshaping the future of image editing technology.
