Saturday, April 27, 2024

Makati law firm impressed by new AI-powered software from SG

While artificial intelligence (AI) chatbots cannot – and most likely will not – replace actual lawyers in the near future, they will make legal research easier, faster, and more efficient.

This was the main takeaway from a talk delivered by Rachel Follosco, senior partner at Makati-based law firm Follosco Morallos & Herce (FMH), on the firm’s experience with a pilot version of an AI-powered solution developed by Singapore-headquartered tech firm Straits Interactive.

According to Follosco, the AI solution the firm is testing, called Capabara, can answer legal queries in less than a minute. An associate lawyer, by comparison, needs more than two hours to accomplish the same task.

While the output produced by the AI software requires further vetting by a real lawyer, Follosco said the speed and efficiency with which it answered the legal query is a major breakthrough that can relieve lawyers of the burden of legal research.

Aside from being a capable legal researcher, Capabara can also perform the job of a data protection officer (DPO) assistant, said Straits Interactive CEO Kevin Shepherdson during the media launch last Feb. 2 at the New World Hotel in Makati City.

Capabara, he said, can serve as a crucial tool for organizations seeking to govern personal data and align themselves with data protection laws, including the Philippines’ Data Privacy Act (DPA), while still being assisted by consultants from Straits Interactive.

Shepherdson said the system draws on Straits Interactive’s “brain” – a knowledge base of data protection regulations, enforcement cases, and advisory insights.

“Armed with this wealth of knowledge, businesses can upload their own internal documents and enjoy the benefits of conversational AI,” the company said.
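The setup described here follows the broad pattern of retrieval-augmented generation: a curated compliance corpus, extended with an organization’s own uploaded documents, supplies the context for each answer. The minimal Python sketch below illustrates that data flow in outline only; every name in it is hypothetical, the keyword-overlap scoring is a stand-in for embedding-based search, and none of it is Straits Interactive’s actual implementation.

```python
# Minimal sketch of a "knowledge base + uploaded documents" assistant,
# in the spirit of the retrieval-augmented pattern described above.
# All names are illustrative; this is not Straits Interactive's code.

from dataclasses import dataclass

@dataclass
class Document:
    source: str   # e.g. "regulation", "enforcement case", "internal policy"
    text: str

class ComplianceAssistant:
    def __init__(self, knowledge_base: list[Document]):
        # The vendor-curated "brain": regulations, cases, advisories.
        self.corpus = list(knowledge_base)

    def upload(self, doc: Document) -> None:
        # Organizations add their own internal documents to the same index.
        self.corpus.append(doc)

    def retrieve(self, query: str, k: int = 3) -> list[Document]:
        # Toy relevance score: keyword overlap. A real system would use
        # embeddings and a vector store instead.
        terms = set(query.lower().split())
        scored = sorted(
            self.corpus,
            key=lambda d: len(terms & set(d.text.lower().split())),
            reverse=True,
        )
        return scored[:k]

    def answer(self, query: str) -> str:
        # A production system would pass the retrieved context to an LLM;
        # here we just assemble the prompt to show the data flow.
        context = "\n".join(f"[{d.source}] {d.text}" for d in self.retrieve(query))
        return f"Answer the query using only this context:\n{context}\nQ: {query}"

kb = [Document("regulation", "Personal data must be processed lawfully and transparently."),
      Document("enforcement case", "A firm was fined for retaining personal data indefinitely.")]
bot = ComplianceAssistant(kb)
bot.upload(Document("internal policy", "We retain customer records for five years."))
print(bot.answer("How long may we retain personal data?"))
```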

According to the tech firm, Capabara is designed to harness the power of generative AI with ChatGPT-like capabilities that allow organizations to develop and manage their digital transformation goals, all while promoting the safe and responsible use of AI.

“We envision the Capabara Capability-as-a-Service to be a platform ecosystem meticulously designed for the AI Business Professional,” Shepherdson said.

“These are the trailblazers equipped to harness generative AI tools, create value, and integrate AI ethically, responsibly, and effectively within their organizations.”

Edwin Concepcion, Straits Interactive’s country manager in the Philippines, said the company’s local unit is ready for the software’s launch in the country.

“We are confident that Capabara will have transformative effects on business workflows, and will propel businesses towards greater productivity in the dynamic landscape of the Philippines,” he said.


Meanwhile, the Data Protection Excellence (DPEX) Centre, the research arm of Straits Interactive, has released findings on the data practices of 100 mobile “Clone Apps” leveraging OpenAI’s GPT APIs on the Google Play Store.

The research uncovered significant discrepancies between declared data safety practices and the actual behavior of these apps, which pose potential privacy risks to individual and corporate users alike.

With ChatGPT’s features increasingly replicated in mobile apps, the study aimed to evaluate the declared data safety information of these ChatGPT-based apps and examine the permissions they actually solicit, particularly regarding personally identifiable information (PII).

The researchers analyzed 100 apps selected from the Google Play Store using the search term ‘ChatGPT’, representing a combined total of 44 million downloads.

The study found that 46% of the apps confidently asserted on their download pages (within the Data Safety section) that they did not collect any PII. Yet the App Permissions section indicates that chat conversations, which are classified as PII, are vulnerable to harvesting.

68% of the apps claimed no third-party PII sharing, but their reliance on the application programming interface (API) of OpenAI – a third party – directly contradicts this claim. Alarmingly, 42% admitted that once user PII is collected, “the data can’t be deleted.”

These disparities cast a shadow over the privacy practices of these apps, especially considering that 58% of them have user ratings averaging 4.3 – likely a reflection of ease of use or effectiveness rather than of data privacy risks.

Importantly, five apps featured among the Top 30 free productivity apps, while four ranked within the Top 20 for gross revenue.

Among the 46 apps that said they do not collect any PII, a startling 29 actually requested at least one app permission that would enable them to collect PII.

The research pointed out a mismatch between the Data Safety declarations and the permissions these apps actively solicit. Notably, the apps requested:

● Access to Microphone: 41%
● Ability to Modify/Delete Photos/Media/Files: 36%
● Ability to Modify/Delete Storage Contents: 34%
● Access to Device ID & Call Information: 13%

Furthermore, a significant 72% of all the apps requested at least one PII-related permission. Worryingly, 35 apps could gather “Device and other IDs,” potentially identifying all user account IDs.
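The heart of this methodology – comparing what an app declares in its Data Safety section against the permissions it actually requests – can be sketched as a simple cross-check. In the illustrative Python below, the permission-to-PII mapping and the sample records are assumptions made for demonstration, not the DPEX Centre’s real dataset or tooling.

```python
# Rough sketch of the declared-vs-requested cross-check described above.
# The PII_PERMISSIONS mapping and the sample records are illustrative
# assumptions, not the DPEX Centre's actual dataset or tooling.

# Android permissions that imply access to personally identifiable information.
PII_PERMISSIONS = {
    "android.permission.RECORD_AUDIO",             # microphone access
    "android.permission.WRITE_EXTERNAL_STORAGE",   # modify/delete files
    "android.permission.READ_PHONE_STATE",         # device ID & call information
}

def flag_discrepancy(app: dict) -> bool:
    """True when an app declares 'no PII collected' yet requests a
    permission that would let it collect PII."""
    if app["declares_no_pii"]:
        return bool(set(app["permissions"]) & PII_PERMISSIONS)
    return False

# Hypothetical sample records standing in for scraped Play Store listings.
apps = [
    {"name": "ChatHelper", "declares_no_pii": True,
     "permissions": ["android.permission.RECORD_AUDIO"]},
    {"name": "HonestBot", "declares_no_pii": True,
     "permissions": ["android.permission.INTERNET"]},
]

flagged = [a["name"] for a in apps if flag_discrepancy(a)]
print(f"{len(flagged)} of {len(apps)} 'no PII' apps request PII permissions: {flagged}")
```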

“While apps replicating ChatGPT’s features are proliferating, our findings serve as a crucial alert. Users often place faith in app declarations. However, the need for heightened vigilance is evident given the disparities we’ve identified,” said Shepherdson.

“Developers, on the other hand, ought to prioritize transparency, ensuring their Data Safety claims align coherently with the permissions they request and the inherent data flows of their apps.”

According to Lyn Boxall, a legal privacy specialist at Lyn Boxall LLC and a member of the research team, “From a legal standpoint, the discrepancies between an app’s declarations on data safety and its actual behaviors are not just alarming but potentially actionable.

“Users have the right to transparency and truthfulness when it comes to their PII. Misrepresenting data practices can lead to serious regulatory repercussions, especially under increasingly stringent data protection laws worldwide. App developers need to recognize that privacy is not just an ethical obligation but also a legal one. The stakes are high, both in terms of potential penalties and loss of user trust.”
