
Coral - Tools for Media Verification

The project combines image segmentation AI models with blockchain attestation, other cryptographic tools, and smart contracts to offer a solution for media verification and authentication in the age of AI. It can be extended further into a royalty compensation system.

Created At

ETHGlobal Paris

Project Description

Project Overview

Coral addresses a pressing societal issue: ensuring the authenticity and personhood of media in the age of AI. It is a verification system that enables users to see which parts of digital content (in this case, images) are AI-generated and which are human-made.

Although the focus of the hackathon is on blockchain and cryptocurrency products, we decided to integrate AI technologies to add a unique and innovative flavor to our solution.

Segment Anything Model (SAM)

A new AI model from Meta AI that can "cut out" any object, in any image, with a single click. SAM is a promptable segmentation system with zero-shot generalization to unfamiliar objects and images, without the need for additional training. Source: AI Computer Vision Research - Meta AI, https://segment-anything.com

I chose this model in particular because it accepts a variety of input prompts: prompts specifying what to segment in an image allow for a wide range of segmentation tasks. The goal is to develop either a simple user interface that lets users swipe across an image to reveal which segments are AI-generated or modified, or to bring this capability into more sophisticated products such as digital crypto fine-art marketplaces, or even integrate it into social networks. It is a seamless, intuitive way to interact with media, assess its humanness, and then estimate its value and price accordingly.
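As a rough sketch of how this could work, the snippet below turns a tap or swipe position into a SAM-style point prompt and asks a backend service for the matching segment. The /segment endpoint, its payload shape, and the MaskResult fields are assumptions for illustration, not part of the hackathon code.

```typescript
// Hypothetical client-side helper: convert a swipe position into a SAM point
// prompt and ask an assumed backend (hosting Segment Anything) for the mask
// covering that point. Endpoint name and payload shape are illustrative only.
interface MaskResult {
  // URL of a PNG mask for the selected segment, plus a flag set by the backend
  // when the segment overlaps a known AI-edited region (assumption for this sketch).
  maskUrl: string;
  aiGenerated: boolean;
}

export async function segmentAtPoint(
  imageId: string,
  x: number,
  y: number,
): Promise<MaskResult> {
  const response = await fetch("/segment", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      imageId,
      // SAM accepts point prompts as (x, y) coordinates with a label:
      // 1 = foreground point, 0 = background point.
      points: [{ x, y, label: 1 }],
    }),
  });
  if (!response.ok) throw new Error(`Segmentation failed: ${response.status}`);
  return (await response.json()) as MaskResult;
}
```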

The Challenge: Authenticity in the Age of AI

With the advent of generative AI models, it's become increasingly simple to create images of public figures and everyday people in scenarios that never occurred in reality. This poses a significant problem for verifying the authenticity of digital content, and our solution is designed to tackle this head-on.

Our Solution: AI-Assisted Authenticity Verification

We plan to build an interface on top of SAM that enables users to distinguish between original and AI-generated content. The idea is to allow users to interact with an image (e.g., swipe across it), upon which the system would display masks marking any parts of the image edited with generative AI tools. This is merely a use-case illustration; the actual solution will have additional features to address the problem more comprehensively.
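Continuing that illustration, a swipe handler could tint any segment flagged as AI-generated. The sketch below assumes the hypothetical segmentAtPoint helper from the earlier snippet, an image with a same-sized canvas overlay on top of it, and a backend that returns a PNG mask URL.

```typescript
// Minimal sketch: as the user swipes across the image, highlight segments
// flagged as AI-generated with a translucent red overlay.
// Assumes a <canvas id="overlay"> positioned over the displayed image.
import { segmentAtPoint } from "./segmentAtPoint"; // hypothetical helper sketched above

const overlay = document.getElementById("overlay") as HTMLCanvasElement;
const ctx = overlay.getContext("2d")!;

overlay.addEventListener("pointermove", async (event) => {
  const rect = overlay.getBoundingClientRect();
  const x = Math.round(event.clientX - rect.left);
  const y = Math.round(event.clientY - rect.top);

  const result = await segmentAtPoint("demo-image", x, y);
  if (!result.aiGenerated) return;

  // Draw the segment's mask semi-transparently so the user can see
  // which region of the picture was machine-made or edited.
  const mask = new Image();
  mask.src = result.maskUrl;
  await mask.decode();
  ctx.globalAlpha = 0.35;
  ctx.drawImage(mask, 0, 0, overlay.width, overlay.height);
  ctx.globalAlpha = 1;
});
```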

Blockchain Component: NFT

To further enhance the verification of original content, we propose the use of blockchain technology to mint Non-Fungible Tokens (NFTs) for the original pieces of content. Regardless of how many times the content is altered or modified, users can refer to the metadata associated with the NFT and find a cryptographic signature that certifies its authenticity.
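For example, an authenticity check against such metadata could look like the following sketch. It assumes the NFT metadata records the original content hash, the creator's address, and a signature over that hash; the field names are illustrative, and only the ethers calls are real.

```typescript
// Sketch of an authenticity check against NFT metadata. Assumes the metadata
// stores { contentHash, signature, creator } - field names are illustrative.
import { keccak256, verifyMessage } from "ethers";

interface CoralMetadata {
  contentHash: string; // keccak256 of the original media bytes
  signature: string;   // creator's signature over contentHash
  creator: string;     // creator's wallet address
}

export function isAuthentic(mediaBytes: Uint8Array, meta: CoralMetadata): boolean {
  // 1. The file in hand must hash to the value recorded at mint time.
  if (keccak256(mediaBytes) !== meta.contentHash) return false;

  // 2. The signature over that hash must recover to the creator's address.
  const signer = verifyMessage(meta.contentHash, meta.signature);
  return signer.toLowerCase() === meta.creator.toLowerCase();
}
```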

How it's Made

The product is built on both cryptographic tools and AI models.

We use the Ethereum Attestation Service (EAS) as the infrastructure for attesting to media that artists create. When a new piece of art or media is created, its authenticity is verified through an EAS attestation.
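A minimal sketch of that attestation step with the EAS SDK is shown below. The schema definition, schema UID, and field names are hypothetical; the SDK calls (EAS, SchemaEncoder, attest) follow the library's documented usage.

```typescript
// Sketch of attesting a new piece of media with the EAS SDK. The schema and
// schema UID are hypothetical; only the SDK calls themselves are real.
import { EAS, SchemaEncoder } from "@ethereum-attestation-service/eas-sdk";
import { ethers } from "ethers";

const EAS_SEPOLIA = "0xC2679fBD37d54388Ce493F1DB75320D236e1815e"; // EAS contract on Sepolia (assumed deployment)
const SCHEMA_UID = "0x..."; // UID of a registered "media authenticity" schema (placeholder)

export async function attestMedia(
  signer: ethers.Signer,
  artist: string,
  contentHash: string,
  ipfsCid: string,
) {
  const eas = new EAS(EAS_SEPOLIA);
  eas.connect(signer);

  // Encode the attestation payload according to the (hypothetical) schema.
  const encoder = new SchemaEncoder("bytes32 contentHash, string ipfsCid");
  const data = encoder.encodeData([
    { name: "contentHash", value: contentHash, type: "bytes32" },
    { name: "ipfsCid", value: ipfsCid, type: "string" },
  ]);

  const tx = await eas.attest({
    schema: SCHEMA_UID,
    data: { recipient: artist, expirationTime: 0n, revocable: false, data },
  });
  return tx.wait(); // resolves to the new attestation UID
}
```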

The user interface is built using JavaScript (though I have not been able to complete this part during the hackathon).

The MetaMask wallet SDK and IPFS are used for transactions and storage.
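For illustration, connecting through MetaMask's injected provider and pinning the original media to IPFS could look roughly like the following; the IPFS node URL is an assumption.

```typescript
// Sketch of the wallet + storage glue: connect through MetaMask's injected
// EIP-1193 provider and add the original media to IPFS.
import { BrowserProvider } from "ethers";
import { create } from "ipfs-http-client";

export async function connectWallet(): Promise<BrowserProvider> {
  // MetaMask injects a provider at window.ethereum.
  const provider = new BrowserProvider((window as any).ethereum);
  await provider.send("eth_requestAccounts", []); // prompts the user to connect
  return provider;
}

export async function storeOnIpfs(file: Uint8Array): Promise<string> {
  const ipfs = create({ url: "http://127.0.0.1:5001" }); // local IPFS API endpoint (assumed)
  const { cid } = await ipfs.add(file);
  return cid.toString(); // content identifier to reference in NFT metadata / attestations
}
```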

Furthermore, a Solidity smart contract enables dynamic pricing and splits payments among creatives based on attributions, so they receive royalty-like compensation whenever their work is sold or commercialised through AI-generated content.
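To illustrate the payout logic such a contract would implement, here is a sketch in TypeScript, mirroring the integer arithmetic the on-chain version would perform; the basis-point convention, field names, and figures are assumptions.

```typescript
// Sketch of the royalty split such a contract could perform: each contributor
// holds an attribution share in basis points, and a sale amount is divided
// pro rata. All names and numbers are illustrative.
interface Attribution {
  creator: string;  // wallet address of the contributor
  shareBps: number; // attribution share in basis points (10000 = 100%)
}

export function splitRoyalties(saleAmountWei: bigint, attributions: Attribution[]) {
  const totalBps = attributions.reduce((sum, a) => sum + a.shareBps, 0);
  if (totalBps !== 10_000) throw new Error("attribution shares must sum to 100%");

  return attributions.map((a) => ({
    creator: a.creator,
    // Integer math mirrors what a Solidity contract would do with uint256.
    payoutWei: (saleAmountWei * BigInt(a.shareBps)) / 10_000n,
  }));
}

// Example: a 1 ETH sale split 70/30 between the original artist and an editor.
const payouts = splitRoyalties(10n ** 18n, [
  { creator: "0xArtist...", shareBps: 7_000 },
  { creator: "0xEditor...", shareBps: 3_000 },
]);
console.log(payouts);
```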
