Crypto exchange Coinbase has evaluated ChatGPT as a token review tool, comparing it to its standard security assessment process. In 12 of 20 cases the AI tool reached the same conclusions as the manual review, but it failed to flag several high-risk assets.
Coinbase Explores Using AI Tool ChatGPT for Secondary Token Risk Checks
Digital asset exchange Coinbase has evaluated the artificial intelligence (AI) chatbot built by OpenAI for streamlining token reviews. The U.S.-based trading platform said ChatGPT was not yet accurate enough to be incorporated immediately into its asset screening process, but that it showed enough promise to warrant further examination.
The initiative is part of Coinbase’s commitment to evaluating token contracts with efficient and effective approaches before listing the assets. The exchange’s Blockchain Security team said it uses in-house automation tools built to help security engineers review ERC-20/ERC-721 smart contracts.
Coinbase further explained how AI entered its contract review process: the emergence of OpenAI’s ChatGPT, and the hype around its ability to identify security vulnerabilities, prompted the team to test the tool at scale rather than as a one-off code reviewer.
Coinbase said ChatGPT shows potential to boost productivity across a range of development and engineering tasks, and that the technology can also aid in code optimization and in detecting security vulnerabilities.
The top cryptocurrency exchange in the United States ran the experiment to compare the accuracy of a token security evaluation produced by ChatGPT against a standard review conducted by a blockchain security engineer using internal tools. To generate comparable risk rankings, the chatbot had to be instructed on how to recognize risks as defined by the platform’s security assessment criteria.
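The prompting step described above can be sketched roughly as follows. Coinbase has not published its actual criteria or prompts, so the rubric, labels, and function name here are illustrative assumptions only:

```python
# Hypothetical sketch: the rubric below is invented for illustration;
# Coinbase's real security assessment criteria are not public.
RISK_CRITERIA = """\
- upgradeable proxy with admin-controlled implementation: high
- owner can mint or burn tokens arbitrarily: high
- fee parameters changeable by owner within fixed bounds: medium
- standard ERC-20 with renounced ownership: low
"""

def build_review_prompt(contract_source: str) -> str:
    """Assemble a prompt that pins the model to a fixed risk rubric,
    so its answers map onto the manual reviewer's risk labels."""
    return (
        "You are a smart-contract security reviewer.\n"
        "Classify the contract below as high, medium, or low risk\n"
        "using ONLY these criteria:\n"
        f"{RISK_CRITERIA}\n"
        "Answer with a single word: high, medium, or low.\n\n"
        f"Contract:\n{contract_source}"
    )

prompt = build_review_prompt("contract Token { /* ... */ }")
```

Constraining the model to a fixed rubric and a one-word answer is what makes its output directly comparable to the engineer's ranking.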
The researchers compared 20 smart-contract risk assessments generated by ChatGPT against manual security reviews. The AI tool matched the manual conclusion in 12 cases; of the eight misses, five involved ChatGPT labeling a high-risk asset as low-risk. In a blog post, the exchange emphasized that underestimating a risk score can be significantly more harmful than overestimating it.
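The arithmetic behind these figures can be reproduced in a few lines. The label pairs below are fabricated placeholders arranged only to match the published counts (12 matches out of 20; 5 of the 8 misses underestimating a high-risk asset):

```python
def compare_reviews(pairs):
    """pairs: list of (manual_label, ai_label) risk labels.
    Returns (number of matches, number of dangerous underestimates)."""
    matches = sum(1 for m, a in pairs if m == a)
    # The asymmetric failure: high-risk asset labeled low-risk by the AI.
    underestimates = sum(1 for m, a in pairs if m == "high" and a == "low")
    return matches, underestimates

# Placeholder data shaped to the reported counts, not real assessments.
pairs = (
    [("low", "low")] * 12          # 12 agreements
    + [("high", "low")] * 5        # 5 dangerous underestimates
    + [("medium", "high")] * 3     # 3 other mismatches
)

matches, underestimates = compare_reviews(pairs)
agreement_rate = matches / len(pairs)  # 12/20 = 0.6
```

Counting the high-to-low misses separately reflects the asymmetry Coinbase stresses: an overestimate wastes review time, while an underestimate lets a risky asset through.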
Despite this “worst case failure” and the tool’s tendency to give contradictory answers when asked the same question repeatedly, Coinbase says the ChatGPT review proved remarkably efficient. The company expects that the tool’s accuracy can be improved through prompt engineering.
Coinbase concluded that the bot cannot yet be relied upon to carry out a security audit on its own. If its team can improve the accuracy, however, the AI tool could serve as a secondary quality assurance check, letting engineers run additional control checks and catch risks that slipped past the first review.
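One way such a second-pass check could work is as a one-directional gate: the AI output never lowers a risk score, but a higher AI score escalates the asset for human re-review. This is a hypothetical sketch of the idea, not a description of Coinbase's pipeline:

```python
# Illustrative only: a secondary QA gate in which the AI can only
# escalate, never downgrade, a manual risk assessment.
RANK = {"low": 0, "medium": 1, "high": 2}

def needs_rereview(manual_label: str, ai_label: str) -> bool:
    """Flag the asset when the AI sees more risk than the human reviewer."""
    return RANK[ai_label] > RANK[manual_label]

# e.g. the manual review said "low" but the AI said "high"
flagged = needs_rereview("low", "high")
```

Because the gate ignores AI downgrades, the high-to-low mislabeling observed in the experiment cannot weaken the manual assessment; the AI can only surface risks the first pass may have missed.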
This year, OpenAI’s ChatGPT has been in the spotlight amid surging demand for artificial intelligence applications. Earlier in March, the world’s largest cryptocurrency exchange, Binance, announced the launch of a new AI-centric non-fungible token (NFT) platform.