Humanity's goals and AI
How to empower humans to systematically define collective goals?
While AI technologies are advancing rapidly, the world still sees conflicts between nations, corporations, and individuals, and people often don't really know what they truly want. This may not seem like a major risk today, but with the prospect of AI becoming superintelligent, it is imperative that humanity collectively define its goals to guide all optimization systems (the alternative is that AI will do it for us, and we may be just one simple idea away from that happening). We do not yet have a way to define goals collectively, with the participation of all life, and we don't yet seem to really know what we want.
Ideas for systems to define and align goals are welcome, as are suggestions to improve this question.
Public self-explanatory, inter-lingual, financial, programmable, hierarchical think-tank
Prioritize the use of AIs to empower humans to communicate and decide together, and AI won't take over the world.
Want to abandon personal goals like fame, money, etc.?
I think the ideal situation is that individuals keep their own goals but do not pursue them blindly. The risks common to all of humanity really do need to be solved together: superintelligent AI is one type of risk, and environmental degradation is another.