
Fear of missing out vs



Artificial General Intelligence (AGI). Photo: LinkedIn

The excitement around Artificial General Intelligence (AGI) as a viable tool for improving organisational effectiveness and quality service delivery is understandable because of the prospect of change it holds.

AGI is a proposed type of artificial intelligence that would possess human-level intelligence and capabilities.

Because AGI can learn, adapt and solve problems, this evolving field will crystallise a different level of risk. For most Chief Executive Officers (CEOs), interest in AGI is driven by FOMO (Fear Of Missing Out). AGI has put valuations on an upward trend, yet its underlying risk, which has so far only been speculated about, is causing them to foot-drag.

Hence, CEOs are at a crossroads in making decisions: how can they harness the benefits of AGI, how should they start, what magnitude of change should they allow, what resources exist to minimise any negative implications, how much should they invest in AGI infrastructure, and will there be a significant return on investment or will AGI simply become another instrument of brand brandishing?

While we agree that AGI will change the risk management landscape significantly, the risk profile varies across types. Early-era AI, which is mainly for predictive and loss-prevention purposes, has low operational risk and its accuracy is manageable within an organisation; its accuracy is directly proportional to the quality of the training data and the learning algorithm. Conversely, for evolving AGI systems that use unsupervised learning methods on unstructured data, the risk is certainly composite and will require a tectonic shift in the approach to risk management.

Perception risk, an implied social pitfall of AGI, is a major factor for service industries because AGI tends to make organisations more impersonal. In their bid to improve customer service delivery, organisations confront customers with AGI-generated and prerecorded audio, thus becoming faceless entities to those customers. The more faceless an organisation becomes, the less loyalty it garners from its customers.

To put this into perspective, there is a transaction threshold above which a customer will want to speak with a human. Hence, the use of AGI in customer service must be shaped to respect each customer's uniqueness and accommodate their service preferences. While it can be used as a tool of proactiveness, the risk of misrepresentation must be intentionally minimised.
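By way of illustration only, and not something the article prescribes, a minimal sketch of such threshold-based routing might look like the following; the threshold value, the stored preference flag and all names here are hypothetical.

```python
# Hypothetical sketch: route a service request to a human agent when the
# transaction is above a set threshold or the customer prefers a person;
# everything else goes to the automated (AGI-driven) channel.
from dataclasses import dataclass

HUMAN_ESCALATION_THRESHOLD = 500_000  # illustrative amount, e.g. in local currency


@dataclass
class ServiceRequest:
    customer_id: str
    transaction_amount: float
    prefers_human: bool = False  # stored service preference


def route_request(request: ServiceRequest) -> str:
    """Return the channel that should handle this request."""
    if request.prefers_human or request.transaction_amount > HUMAN_ESCALATION_THRESHOLD:
        return "human_agent"
    return "automated_assistant"


if __name__ == "__main__":
    print(route_request(ServiceRequest("C001", 25_000)))        # automated_assistant
    print(route_request(ServiceRequest("C002", 750_000)))       # human_agent
    print(route_request(ServiceRequest("C003", 10_000, True)))  # human_agent
```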
Expectedly, credit risk management is one area where AGI could be of tremendous advantage, as it may give management the foresight to proactively hedge against impending non-performance.

It provides an opportunity to monitor credit quality on an ongoing basis using data that relates to customers' lifestyles, contemporaneous financial records (even those held by third parties) and social tendencies. Although this will be highly dependent on third-party-sourced data, it would be a mistake to base credit decisions solely on such unverifiable third-party data.

Hence, investment in AGI for credit administration should be tepid and, most importantly, AGI should be used as a decision-accelerator tool (human validation of statistical decisions): it should present data but not make final decisions, no matter how insignificant those decisions may seem.
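As a purely illustrative sketch of that decision-accelerator idea (the article does not specify any implementation), the snippet below has a statistical model propose a credit action while a human officer records the final decision; the scores, thresholds and function names are hypothetical.

```python
# Hypothetical sketch of a "decision accelerator": a statistical model proposes
# a credit action, but only a human officer's decision is recorded as final.
from dataclasses import dataclass


@dataclass
class CreditAssessment:
    customer_id: str
    probability_of_default: float  # illustrative model output
    model_recommendation: str      # "approve", "review" or "decline"


def propose_action(customer_id: str, probability_of_default: float) -> CreditAssessment:
    """The model only proposes an action; it never finalises anything."""
    if probability_of_default < 0.05:
        recommendation = "approve"
    elif probability_of_default < 0.20:
        recommendation = "review"
    else:
        recommendation = "decline"
    return CreditAssessment(customer_id, probability_of_default, recommendation)


def finalise_decision(assessment: CreditAssessment, officer_decision: str, officer_id: str) -> dict:
    """Record the human officer's decision as final, alongside the model's view."""
    return {
        "customer_id": assessment.customer_id,
        "model_recommendation": assessment.model_recommendation,
        "final_decision": officer_decision,  # human validation, not the model
        "decided_by": officer_id,
    }


if __name__ == "__main__":
    assessment = propose_action("C045", probability_of_default=0.12)
    print(finalise_decision(assessment, officer_decision="decline", officer_id="RO-17"))
```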

On the operations front, AGI has, as expected, the capability to enhance operational or process efficiency because its functionality does not depend on data crunching but on spatial pattern recognition and matching.

For the manufacturing sector, the outright replacement of humans on the manufacturing line may be the way to go in the long run, but in the midterm only moderate investment should be sought. This is because the process of using device drivers in manufacturing is largely logic-driven. On a cautionary note, most operators are not convinced that deploying AGI-based device drivers will not create new types of risk: imagine the AGI going rogue on the manufacturing line. Regardless of the level of reliability, there are functions humans cannot surrender to AGI machines to perform.

Additionally, the risk of internal abuse and misuse is certainly a huge area of concern; natural wisdom may dictate that AGI should be limited to performance enhancement.

It is likely that this can be abused through manipulation, outright data corruption, or unauthorised tweaking of the learning algorithm to perpetrate fraud, which can lead to loss of investment. The extent to which an AGI network can be intentionally manipulated is vast, and it could be cost-intensive to mitigate without committing to a three-dimensional security architecture.

Moreover, AGI can be very effective in report rendition, as it can flag inconsistencies through trend monitoring. Great caution should be exercised when it is being deployed for reconciliation, reclassification or account regularisation. It is pertinent to know that this risk can only be tamed through design-level mitigation and internal and external structures that continuously evaluate the veracity of the AGI network.
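A minimal, hypothetical sketch of the trend-monitoring idea is shown below: each newly reported figure is compared with the recent trend and flagged for human review if it deviates sharply. The tolerance and sample figures are illustrative, not drawn from the article.

```python
# Hypothetical sketch of trend monitoring for report rendition: a new figure is
# flagged for human review if it strays too far from the recent trend.
from statistics import mean, stdev


def flag_inconsistency(history: list, new_value: float, tolerance: float = 3.0) -> bool:
    """Return True if new_value deviates from the recent trend by more than
    `tolerance` standard deviations (a deliberately simple, illustrative rule)."""
    if len(history) < 3:
        return False  # not enough history to establish a trend
    spread = stdev(history)
    if spread == 0:
        return new_value != history[-1]
    return abs(new_value - mean(history)) > tolerance * spread


if __name__ == "__main__":
    monthly_balances = [102.0, 98.5, 101.2, 99.8, 100.4]
    print(flag_inconsistency(monthly_balances, 100.9))  # False: within the trend
    print(flag_inconsistency(monthly_balances, 150.0))  # True: flag for human review
```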

While we do not want CEOs to traffic in pessimism, it is important to inform them that investment in evolving technology such as AGI should be moderate and largely focused on process improvement, not outright process replacement.

The investment should seek to support human employees to be the best they can be, focus on improving interactions with customers, and place the company a step ahead of its customers when it comes to service needs and delivery.

Deeper insight reveals that, without proper and thorough planning, the investment required to maintain an AGI network in service and mitigate its risks may outweigh the cost of its acquisition or deployment. The system can be designed to cope with issues such as deep fakes, digital versioning and pattern mismatch, which are caused by external factors, by increasing its monitoring. But the problems of data tampering and algorithm alteration, which stem from internal breaches, are hard to notice and root out.

Finally, CEOs' investment in AGI should be focused on gradual implementation and organic growth within the system, augmented by continuous customer education and robust communication. More so, it is advisable that the security architecture be a combination of architecture-level security implementation and internal and external controls.

This is necessary to have a counter-monitoring system in place in order to balance views and outputs. Before investing in the deployment of AGI tools, it is more important to develop internal capacity to reduce risk.

AGI tools are constantly changing, and trying to keep up with them could compromise the quality and efficiency of the process and service delivery system. From a global perspective, AGI will undoubtedly transform how we communicate and how services are rendered, and remodel our lifestyles. The feeling of FOMO is valid and worth emphasising because, honestly, those who start early will have an advantage.

Bakre is a Digital Ethicist and Managing Partner, Homo Economicus Limited.




