I asked many friends about this question and also consulted various AIs, including ChatGPT, but none of the answers satisfied me. After thinking it over for a long time, I arrived at this conclusion: only top-level resource allocators, such as government officials and business owners, especially those in relatively autocratic institutions and regions, can avoid being replaced by AI and robots, because no one is willing to sacrifice their own interests.


I. The Essence of Power: The "Right to Allocate Interests" That AI Cannot Replace


The core of resource allocation is the "will to power": the essential function of government officials and business owners is to set rules, distribute benefits, and balance competing demands. Officials must weigh economic development against people's livelihoods; business owners must mediate among shareholders' interests, employees' rights, and social responsibility. Such decisions often hinge on fuzzy value judgments (for example, the trade-off between "fairness" and "efficiency"), and an AI's data-driven "optimal solution" may run against the personal or group interests of the decision-makers, as the sketch below illustrates.
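A minimal sketch of the fairness-versus-efficiency point (the regions, numbers, and weights below are all invented for illustration): the "optimal" split of a fixed budget flips depending on how much weight the allocator places on equality, and that weight is a value judgment no dataset can supply.

```python
# Toy illustration: the "optimal" allocation depends on a value judgment,
# not only on data. All figures are invented.

# Suppose region A turns each budget unit into 3 units of output,
# region B into only 1, and a budget of 100 must be split between them.
candidate_splits = [(a, 100 - a) for a in range(0, 101, 10)]

def output(a, b):
    """Total economic output: rewards efficiency (favors region A)."""
    return 3 * a + 1 * b

def equality(a, b):
    """Negative gap between the regions: rewards an even split."""
    return -abs(a - b)

def best_split(fairness_weight):
    """Maximize output + fairness_weight * equality over the candidate splits."""
    return max(candidate_splits,
               key=lambda s: output(*s) + fairness_weight * equality(*s))

for w in (0.0, 0.5, 2.0):
    print(f"fairness weight {w}: best split A/B = {best_split(w)}")
# Weights 0.0 and 0.5 send everything to region A; weight 2.0 evens the split.
# The weight itself is the human value judgment that data cannot decide.
```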


The "Human Moat" in the Autocratic System: In highly centralized organizations (such as family businesses and authoritarian governments), decision-making power highly depends on personal authority rather than objective rules. The algorithm logic of AI is in natural conflict with this "rule by man" model. For instance, a certain leader may give priority to supporting the enterprises of his cronies, while if AI recommends the optimal solution according to market data, it will instead threaten the stability of the power structure.

Examples: Saudi Arabia's Crown Prince uses AI to monitor public sentiment, but final decisions (oil production cuts, foreign policy) still follow the royal family's interests rather than the algorithm's suggestions; the CEO of a certain manufacturing giant rejected an AI-optimized layoff list and instead protected the senior old guard to maintain internal loyalty.


II. The Limitations of AI in Resource Allocation


The "Uncomputability" of Interest Games: AI is good at handling deterministic goals (such as maximizing profits), but in reality, resource allocation often involves implicit transactions (such as political donations and personal connections). For example, in the bidding for government projects, AI may recommend the contractor with the best technology, but officials may choose the "connected parties" in exchange for long-term support.


The "Gray Area" of Ethics and Morality. In the corporate scenario: AI can optimize the supply chain, but it cannot determine whether a polluting factory should be shut down (involving the conflict between employee unemployment and environmental protection); In the political scenario: AI can predict the economic impact of policies, but it cannot decide whether GDP should be sacrificed in the short term for electoral interests.


The Risks of Technological Dependence: if resource allocation were handed over entirely to AI, control over power itself could be lost. For example, an algorithm that prioritizes the needs of the majority may deepen the dissatisfaction of minority groups (such as the protests from vulnerable groups triggered by AI recommendations in US healthcare reform); see the sketch after this paragraph.
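A minimal sketch of that failure mode, with invented groups, plans, and coverage figures: an allocator that maximizes total people served funds a majority-only plan, while a max-min rule that protects the worst-off group picks a different one; choosing between the two rules is precisely the political judgment described above.

```python
# Toy sketch: maximizing aggregate benefit can leave a minority with nothing.
# Groups, plan names, and coverage figures are all hypothetical.

# Each plan lists how many people it covers in the majority and minority groups.
plans = {
    "Plan A": {"majority": 900, "minority": 0},
    "Plan B": {"majority": 700, "minority": 150},
    "Plan C": {"majority": 500, "minority": 300},
}

def total_covered(plan):
    """Utilitarian objective: total people served, regardless of group."""
    return sum(plan.values())

def worst_off_covered(plan):
    """Max-min objective: coverage of the least-served group."""
    return min(plan.values())

utilitarian_pick = max(plans, key=lambda p: total_covered(plans[p]))
maxmin_pick = max(plans, key=lambda p: worst_off_covered(plans[p]))

print("Maximize total coverage:", utilitarian_pick)  # Plan A, minority gets 0
print("Protect the worst-off:  ", maxmin_pick)       # Plan C
# The algorithm cannot say which objective is "right"; that choice is the
# value judgment left to the human allocator.
```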


III. Extending the Definition of "Top-Level Resource Allocators"


This view can be extended to the following roles, whose irreplaceability likewise rests on the human manipulation of interests:


The "Informal Authority" within the Organization. For example, the "elders" in family businesses and the "invisible brokers" within the system. They exercise the power of resource allocation through personal relationship networks rather than their positions, and it is difficult for AI to quantify this implicit power.


Decision-Makers in a Crisis: in sudden public events (such as pandemics and wars), leaders must make quick decisions with incomplete information, whereas AI relies on historical data and cannot handle unprecedented, extreme situations.


The "Gatekeepers" of Cultural Values. Religious leaders and the "heads" of traditional handicraft guilds control resources (such as the allocation of funds for the inheritance of intangible cultural heritage) by maintaining the uniqueness of culture, and AI lacks the perception of cultural values.