About the Migration Assistance API

The Migration Assistance API enables you to migrate data from external systems into the Akkuro Lending platform efficiently and securely. The process begins by uploading your data to an isolated staging area, where it can be validated without impacting live systems. After validation is complete and all issues are resolved, you can safely import the data into the target environment. Once imported, you can run verification counts to confirm that the data was transferred accurately.

Currently, you can migrate data into the following Akkuro Lending domains (referred to as components in the API schema):

  • Counterparties: Data about borrowers (individuals or organisations).
  • Collaterals: Data about assets or items pledged as security for a loan.
  • Financing Execution: Data related to financial and legal actions that occur after a loan agreement is activated.
  • Financing Solution: Data that describes the loan structure after the loan agreement is finalized, such as the loan purpose, amount, type, interest rate, withdrawal, and repayment method.
  • Administrate: Data related to payment administration generated from imported financing executions and solutions.

You can find the migration APIs for each domain by going to the corresponding functional group in the API schemas.

Data migration steps

The data migration process involves the following steps to ensure that your data is transferred accurately and securely from external systems to the Akkuro Lending platform. Each stage is managed through API calls that handle specific tasks in sequence:

  1. Register migration: Start by registering a new migration process. This step returns a migration-reference-ID, which is used to identify and track the entire migration process.

  2. Load data: Stream your data to the staging area. The staging area acts as a temporary workspace where data is stored and prepared before being imported into the target environment.

  3. Validate data: Start a validation job to verify the staged data. The system checks for missing fields, invalid formats, and rule violations. Validation must succeed before you can import.

  4. Import data: Once validation is successful, import the data into the target environment. This step makes the data officially available in the live environment.

  5. Verify counts: After importing, start a verification counts job to compare the number of staged records against the number of imported entities. This step helps confirm that all data was transferred accurately.

Monitoring and logging: During steps 2–5, you can track the migration's progress and review logs. Monitoring helps ensure data integrity and provides visibility into any errors or warnings that may occur.
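The five steps above can be sketched as a minimal in-memory flow. This is an illustrative model of the sequencing rules (validation must succeed before import; verification compares staged and imported counts); the class and method names are assumptions, not the real API surface.

```python
# Illustrative sketch of the migration sequence; not the actual API client.

class MigrationSketch:
    def __init__(self):
        # 1. Register migration: the real API returns a migration-reference-ID.
        self.migration_reference_id = "mig-001"  # placeholder value
        self.staged = []
        self.validated = False
        self.imported = []

    def load(self, record: dict):
        # 2. Load data: stream a record into the staging area.
        self.staged.append(record)

    def validate(self, required_fields: set) -> bool:
        # 3. Validate data: here, simply check that every staged record
        # has all required fields.
        self.validated = all(
            required_fields <= record.keys() for record in self.staged
        )
        return self.validated

    def import_data(self):
        # 4. Import data: only allowed after successful validation.
        if not self.validated:
            raise RuntimeError("validation must succeed before importing")
        self.imported = list(self.staged)

    def verify_counts(self) -> bool:
        # 5. Verify counts: compare staged records against imported entities.
        return len(self.staged) == len(self.imported)
```

A typical run loads records, validates, imports, then confirms the counts match; attempting `import_data()` before a successful `validate()` raises an error, mirroring the rule that validation must pass first.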

For a more detailed guide on how to migrate data, refer to the Migrate data using the migration API page.

Migrated data format

The migration API supports JSON Lines (JSONL) and JSON Text Sequences (JSON-seq) formats for streaming data.

  • JSONL: Each record is a single JSON object on its own line, with records separated by newline characters (\n).
  • JSON-seq: A sequence of JSON objects in which each object is prefixed with a record separator character (0x1E), as defined in RFC 7464, to distinguish individual objects.

Both formats are ideal for large-scale or incremental data migration. They allow you to continuously stream individual records to the staging area, rather than sending a single large JSON file.
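The two formats can be produced from the same records with the standard library alone. This is a minimal encoding sketch; the record contents are made up for illustration.

```python
import json

records = [
    {"id": 1, "name": "Borrower A"},
    {"id": 2, "name": "Borrower B"},
]

# JSONL: one JSON object per line, records separated by "\n".
jsonl = "".join(json.dumps(r) + "\n" for r in records)

# JSON-seq (RFC 7464): each object prefixed with the record
# separator character 0x1E and terminated with "\n".
RS = "\x1e"
json_seq = "".join(RS + json.dumps(r) + "\n" for r in records)
```

Because each record is framed independently, either encoding lets you stream records to the staging area one at a time instead of building one large JSON document in memory.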

Authentication

Contact your Customer Service representative to obtain valid API credentials. Then, refer to Authentication and authorization for implementation details.

Data privacy

To ensure the safety and confidentiality of user data during migration, the following privacy measures are enforced:

  • No personal data in migration logs and verification counts: Migration logs record only reference IDs, the unique codes used to track progress. Verification count results contain only aggregate numbers and category labels. Neither contains Personally Identifiable Information (PII) such as names, email addresses, or phone numbers. This protects user privacy and reduces risk if logs are accessed.
  • Isolated staging environments: Each migration process uses a separate staging area to temporarily store data. This isolation ensures that data from separate migrations, identified by unique migration reference IDs, remains segregated. This prevents accidental exposure or cross-contamination between datasets and allows easy management.
  • Audit trail for compliance: All migration activities are fully tracked through an audit trail. These records help ensure compliance with data protection regulations and support accountability.

Access control

To maintain security and prevent unauthorized access during data migration, the system enforces strict access control policies:

  • Migration permission: Only users with the appropriate permissions can receive valid API credentials to perform migration tasks. This ensures that data migrations are handled only by authorized personnel.
  • Granular domain-level restrictions: Access can be restricted at the domain level, meaning users can be granted access only to specific domains of the system, such as Counterparties or Collaterals. This minimizes exposure and enhances data protection.
  • Read-only access for monitoring: Users who need to observe the migration process without making changes can be granted read-only access. This allows for safe monitoring and auditing without the risk of accidental modifications.