Abstract
Since the release of OpenAI's ChatGPT, the world has entered a race to develop more capable and powerful AI, including artificial general intelligence (AGI). This development is constrained by AI's dependence on model architecture and on the quality and quantity of training data, which makes training highly costly in both resources and environmental impact. Improving the effectiveness and efficiency of AI training is therefore essential, especially as the Earth approaches climate tipping points and planetary boundaries. Evidence suggests that AI systems perform better when trained to mimic certain human cognitive processes. Drawing on insights from quantum mechanics and an information entropy-based notion of value, we suggest that AI developers can make training more effective and efficient by designing parameter systems capable of assigning probabilities to informational quanta. Such systems can reduce entropy within the system, which lowers the energy required for data storage and processing while decreasing the likelihood of information loss during training. Successfully applying the concept of information entropy-based value in AI development will be crucial for advancing generative AI and achieving AGI in a sustainable manner.
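The link the abstract draws between probability assignment and reduced storage/processing cost follows from Shannon's source coding theorem: a distribution with lower entropy needs fewer bits per symbol on average. A minimal sketch of this idea, using hypothetical distributions (the specific probability values are illustrative, not from the paper):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 8 symbols: maximal uncertainty, 3 bits/symbol.
uniform = [1 / 8] * 8

# A hypothetical "informed" distribution, where a parameter system has
# concentrated probability mass on a few likely informational quanta.
sharpened = [0.7, 0.1, 0.1, 0.05, 0.03, 0.01, 0.005, 0.005]

h_uniform = shannon_entropy(uniform)
h_sharpened = shannon_entropy(sharpened)

# Lower entropy implies a shorter average code length, i.e. less energy
# spent on storing and transmitting the same information.
assert h_sharpened < h_uniform
```

Under these assumptions, sharpening the distribution cuts the average information content per symbol by roughly half, which is the sense in which better probability assignment lowers the cost of data storage and processing.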