How-To: Encrypt Input Message

Prerequisites

  1. Encryption Keys:

    • A Secret Key must be defined in the Query Client to encrypt and decrypt messages.

    • The corresponding Evaluation Key must be deployed in LatticaAI.

    If the key pair is already defined, you do not need to create it again.

  2. User Access Token:

    • The user must have a valid Access Token, which provides permission to interact with the AI model.

    • Tokens are unique to each user and model; see the token-generation sketch after this list.
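
If you have not yet generated an Access Token for the Query Client, the sketch below illustrates the general shape of that step. The generate_user_token name comes from the placeholder string in the encryption snippet further down; its exact module path, location, and parameters are assumptions here, so treat this as an illustrative sketch rather than the definitive API.

import lattica_common.app_api as agent_app

# Hypothetical call: generate_user_token may live elsewhere (for example, in an
# account/admin API or the LatticaAI console) and may take different parameters.
query_token = agent_app.user.generate_user_token(model_id="your_model_id")

# The resulting token is unique to this user and model and expires in 30 days.
print(query_token)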


Use the following code snippet to encrypt the input message. The offline phase exchanges the User Access Token for the user data (serialized context, secret key, and homseq), and the encryption call then takes that user data together with the message to be encrypted:

import pandas as pd
import torch

import lattica_common.app_api as agent_app

# Note: your query token expires in 30 days
query_token = "the_query_token_you_got_using_the_generate_user_token"

# user_data is a tuple of:
# (serialized_context, serialized_secret_key, serialized_homseq)
# which you need for encrypting the query and querying the model
user_data = agent_app.user.query_offline_phase(query_token)

# Load the MNIST samples and normalize pixel values to [0, 1]
dataset = pd.read_csv('data/mnist_data.csv').values / 255

# Select a single sample as the input message
data = torch.tensor(dataset[0])

# Encrypt the input message using the user's secret key material
serialized_ct = agent_app.user.encrypt(user_data, data)
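
To encrypt several inputs, you can reuse the same calls once per sample. The following sketch assumes, as in the snippet above, that agent_app.user.encrypt accepts a single sample tensor at a time; the batch size of 5 is arbitrary.

import pandas as pd
import torch

import lattica_common.app_api as agent_app

query_token = "the_query_token_you_got_using_the_generate_user_token"
user_data = agent_app.user.query_offline_phase(query_token)

# Load and normalize the samples, then encrypt each one separately,
# producing one serialized ciphertext per input message
dataset = pd.read_csv('data/mnist_data.csv').values / 255
encrypted_inputs = [
    agent_app.user.encrypt(user_data, torch.tensor(row))
    for row in dataset[:5]
]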

This step focuses on a specific part of the query process: input message encryption. If you prefer to perform encryption, query execution, and decryption in a single command, refer to [How-To: Encrypt, Execute, and Decrypt in One Step].
