Artificial intelligence (AI) is advancing rapidly, fueling concerns about job security, but a new MIT study suggests that the current cost of deploying AI makes it more economical for employers to keep human workers on many vision-based tasks.
The MIT study, “Beyond AI Exposure,” weighs a factor that most AI-exposure analyses overlook: cost. By modeling what it costs firms to build and deploy AI vision systems, and comparing that cost with the wages those systems would replace, the study finds that only 23% of worker compensation on tasks exposed to AI computer vision would be cost-effective for firms to automate. In other words, the high upfront cost of AI systems currently makes automation financially unattractive for most such tasks.
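For intuition, the study's core comparison can be sketched as a simple break-even rule: a firm automates a vision task only when the annualized cost of building and running the AI system falls below the wages paid for that task. The Python sketch below is illustrative only; the function name and dollar figures are hypothetical and not taken from the paper.

```python
# Hypothetical break-even rule: automate only if the annualized cost of the
# AI system is lower than the wages currently paid for the task it replaces.
# The figures are made up for illustration; they are not from the MIT study.

def worth_automating(annual_ai_cost: float, annual_task_wages: float) -> bool:
    """Return True if deploying the AI system costs less than the labor it replaces."""
    return annual_ai_cost < annual_task_wages

# Example: a $500k/year vision system replacing $120k/year of task wages.
print(worth_automating(500_000, 120_000))  # False: keeping human workers is cheaper
```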
Even allowing for falling costs as the technology matures, the study emphasizes that with a 20% annual cost reduction it would still take decades for computer vision automation to become economical for most firms. The study's framework and results aim to provide a more realistic perspective on how AI will affect different job roles.
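To see why even steep cost declines imply long timelines, assume costs decay exponentially: at a 20% annual reduction, a system's cost after t years is cost × 0.8^t, so the years to break even grow with the logarithm of how overpriced the system is relative to the wages it replaces. The sketch below works through that arithmetic; the inputs are hypothetical, and the study's own model is more detailed than this back-of-the-envelope version.

```python
import math

def years_until_economical(ai_cost: float, task_wages: float,
                           annual_decline: float = 0.20) -> float:
    """Years until an AI system costing `ai_cost` per year undercuts
    `task_wages`, assuming costs fall by `annual_decline` each year,
    i.e. cost_t = ai_cost * (1 - annual_decline) ** t."""
    if ai_cost <= task_wages:
        return 0.0  # already cheaper than the labor it replaces
    return math.log(task_wages / ai_cost) / math.log(1 - annual_decline)

# Hypothetical example: a system 20x more expensive than the labor it replaces
# still needs over a decade of 20%/year cost declines to break even.
print(years_until_economical(2_400_000, 120_000))  # ~13.4 years
```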