Migrate users to AuthN

Users can be imported into AuthN using a comma-separated values (CSV) file.

During the import process, you have the option to map fields between the file and the user data in AuthN. When importing users with hashed passwords, additional data is required, such as the hash algorithm, salt, salt order, and number of hash iterations.

You can import users from any system using CSV files. The sections below provide instructions for migrating users from Firebase and Auth0, both of which require additional preparation of the user data before it can be imported into AuthN.

note

Both Firebase and Auth0 allow the export of user data in JSON format, which might be easier to process if you need to make changes to the data. Currently, AuthN supports direct import only from a CSV file, but you can convert your JSON data to CSV using a script. An example script for converting JSON to CSV is provided at the end of this document.

Firebase migration

Export users with Firebase CLI

The first step in migrating users from Firebase is to export them:

  1. Install the Firebase Command Line Interface (CLI) to enable exporting users using a command in the terminal.

  2. In a terminal window, use Firebase CLI to export users from your Firebase project:

    1. Sign in to your Firebase account:

      firebase login
    2. List your Firebase projects:

      firebase projects:list
    3. Export users, replacing the <project-id> placeholder in the following command with your actual Project ID:

      firebase auth:export firebase-users.csv --format=CSV --project <project-id>

      This will export your users into a file with the specified name.
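Firebase's CSV export typically has no header row, so before mapping fields in AuthN it can help to preview the first record and note which positional column holds which value. The sketch below assumes the file name firebase-users.csv used in the export command; adjust it for your own file:

```python
import csv


def preview_first_row(csv_path):
    """Return the fields of the first record in a CSV file.

    With no header row, the first record shows which positional
    column holds which value (UID, email, password hash, salt, ...).
    """
    with open(csv_path, newline="") as f:
        return next(csv.reader(f))


if __name__ == "__main__":
    import os

    # Only runs when the export file is present in the current directory.
    if os.path.exists("firebase-users.csv"):
        for index, value in enumerate(preview_first_row("firebase-users.csv")):
            print(f"column {index}: {value}")
```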

Get additional data required for importing password hashes

  1. Sign in to your Firebase console and open your project.

  2. Select Authentication and then select the Users tab.

  3. Next to the Add Users button, click the three dots button to open the menu, and then select Password hash parameters. Your parameters should resemble the following:

    hash_config {
    algorithm: SCRYPT,
    base64_signer_key: XXXX/XXX+XXXXXXXXXXXXXXXXX+XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX==,
    base64_salt_separator: Aa==,
    rounds: 8,
    mem_cost: 14,
    }

    You will use these values when importing your user data with hashed passwords.

Import users from Firebase

  1. Open the AuthN Users page in the Pangea User Console.

  2. Click + New.

  3. Click Import Users. This will open an Import users dialog.

  4. Click Import CSV. This will open a file selector.

  5. Navigate to the CSV file containing your user data, select it, and submit the file selection. This will open a User profile mapping dialog.

  6. In the dialog, map the CSV fields to the AuthN user fields:

    • Map the Required email field to the corresponding field in your user data with valid email values.

    • If desired, you can also map the Optional fields.

      If your user data contains password hashes, you can map them to AuthN's password_hash field. Importing password hashes allows your users to sign in with their existing passwords and complete the enrollment process during their first login. To ensure this works correctly, you need to provide the correct values from your Firebase hash config:

      • Map the Password Hash field in the Firebase data to the password_hash field in AuthN. This will display a Password Hash section at the bottom of the User profile mapping dialog.
      • In the Password Hash section, for Hash Algorithm, choose FIREBASE-SCRYPT. This will open a Hash Parameters dropdown.
      • In the Hash Parameters dropdown, select Manual. This will display inputs for additional password hash parameters. You may need to scroll down to see all of these parameters. Some of them may be pre-populated by default:
        • Set Cost Factor to match the mem_cost in the Firebase hash config, which is 14 by default.
        • For Parallelism, enter 1.
        • Set Block Size to match the rounds in the Firebase hash config, which is 8 by default.
        • For Salt Separator, enter the value from base64_salt_separator in the Firebase hash config.
        • For Signer Key, enter the value from base64_signer_key in the Firebase hash config.
        • For Salt, select Per-User. This will display Per-User Salt Field.
        • For Per-User Salt Field, choose the Password Salt field from the Firebase CSV.

      During their first authentication, imported users will be required to complete their enrollment process.

  7. Click Import. This will take you back to the Import Users dialog with the list of recent imports.

  8. Click Done.

  9. Refresh the AuthN Users page in the Pangea User Console to view the newly imported user records.

If you encounter errors during the import process, ensure that your user data is correctly mapped to the required AuthN fields and that it contains valid values.

You can import a CSV file multiple times without overwriting existing users with the same email addresses. Only users with unique email addresses are processed and added to AuthN.
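A common cause of import errors is missing or malformed values in the mapped email column. A quick pre-import check can be sketched as follows; the column index and file name are assumptions, so adjust them to match your export:

```python
import csv
import re

# Loose pattern for a sanity check only, not full RFC email validation.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def find_invalid_emails(csv_path, email_column):
    """Return (row_number, value) pairs whose email column looks invalid."""
    bad = []
    with open(csv_path, newline="") as f:
        for row_number, row in enumerate(csv.reader(f), start=1):
            value = row[email_column].strip() if len(row) > email_column else ""
            if not EMAIL_RE.match(value):
                bad.append((row_number, value))
    return bad


if __name__ == "__main__":
    import os

    # Column index 1 is an assumption about your export's layout.
    if os.path.exists("firebase-users.csv"):
        for row_number, value in find_invalid_emails("firebase-users.csv", 1):
            print(f"row {row_number}: invalid email {value!r}")
```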

Auth0 migration

Export Auth0 user data

The first step in migration is to export existing users from Auth0.

Export users using Auth0 Management API

  1. Navigate to Auth0 Dashboard and select your tenant.

  2. Enable Auth0 API explorer under Applications >> APIs >> API Explorer.

  3. Collect your Auth0 API explorer application information for making API requests:

    • Under Applications >> APIs:
      • API Audience for Auth0 Management API
    • Under Applications >> Applications >> API Explorer Application >> Settings:
      • Domain
      • Client ID
      • Client Secret
    • Under Authentication >> Database >> Username-Password-Authentication:
      • Database Identifier

    In the following examples, replace the placeholders with the values specific to your application:

    export AUDIENCE="<API Audience>"
    export DOMAIN="<Domain>"
    export CLIENT_ID="<Client ID>"
    export CLIENT_SECRET="<Client Secret>"
    export CONNECTION_ID="<Database Identifier>"
  4. Request an access token:

    curl --location "https://$DOMAIN/oauth/token" \
    --header 'Content-Type: application/json' \
    --data '{
    "client_id": "'"$CLIENT_ID"'",
    "client_secret": "'"$CLIENT_SECRET"'",
    "audience": "'"$AUDIENCE"'",
    "grant_type": "client_credentials"
    }'

    The response will contain an access token in the "access_token" field:

    {
    "access_token": "eyJ...BFQ",
    "scope": "read:client_grants...delete:client_credentials",
    "expires_in": 86400,
    "token_type": "Bearer"
    }

    Use this token in your next request:

    export ACCESS_TOKEN="<access_token>"
  5. Make a request to export the users. For example:

    curl --location "https://$DOMAIN/api/v2/jobs/users-exports" \
    --header "Authorization: Bearer $ACCESS_TOKEN" \
    --header 'content-type: application/json' \
    --data '{
    "connection_id": "'"$CONNECTION_ID"'",
    "format": "csv"
    }'
    note

    The above example requests users from a database connection. To export all tenant users, specify only the "format" parameter.

    Alternatively, you can include additional criteria to limit the number of exported user records and/or specify which fields to export. For example:

    curl --location "https://$DOMAIN/api/v2/jobs/users-exports" \
    --header "Authorization: Bearer $ACCESS_TOKEN" \
    --header 'content-type: application/json' \
    --data '{
    "connection_id": "'"$CONNECTION_ID"'",
    "format": "csv",
    "limit": 5,
    "fields": [
    {
    "name": "user_id"
    },
    {
    "name": "email"
    },
    {
    "name": "identities[0].connection",
    "export_as": "provider"
    }
    ]
    }'

    You can learn more in the Auth0 documentation.

    The response will contain a reference to the export job under the "id" field.

    {
    "connection": "Username-Password-Authentication",
    "connection_id": "con_XXXXXXXXXXXXXXXX",
    "format": "csv",
    "id": "job_v03DmlXLgekQfZXW",
    "status": "pending",
    "type": "users_export",
    ...
    }

    Use this job ID in your next request:

    export JOB_ID="<id>"
  6. Request the resulting download URL for the job. For example:

    curl --location "https://$DOMAIN/api/v2/jobs/$JOB_ID" \
    --header "Authorization: Bearer $ACCESS_TOKEN"

    In the "location" field within the response, find a presigned URL, which you can use to download the exported user data:

    {
    "type": "users_export",
    "status": "completed",
    ...
    "location": "https://l0-prod-prod-us-4-usw2-export-users.s3.us-west-2.amazonaws.com/job_v03DmlXLgekQfZXW/dev-xxxxxxxxxxxxxxxx.csv.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256...&X-Amz-SignedHeaders=host"
    }
    note

    The downloaded file could be a gzip archive. In that case, you'll need to extract the CSV file before importing it into AuthN. For example:

    gunzip dev-tenant.csv.gz
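The status and download requests above can also be scripted, which is convenient because the export job completes asynchronously. The sketch below polls the job endpoint shown above until the job finishes, then downloads and extracts the result; it assumes the same DOMAIN, ACCESS_TOKEN, and JOB_ID environment variables, and the output file names are placeholders:

```python
import gzip
import json
import os
import shutil
import time
import urllib.request


def extract_download_url(job):
    """Return the presigned download URL once the export job has completed."""
    return job.get("location") if job.get("status") == "completed" else None


def fetch_job(domain, access_token, job_id):
    """Fetch the export job's current state from the Auth0 Management API."""
    request = urllib.request.Request(
        f"https://{domain}/api/v2/jobs/{job_id}",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


if __name__ == "__main__" and all(
    key in os.environ for key in ("DOMAIN", "ACCESS_TOKEN", "JOB_ID")
):
    domain = os.environ["DOMAIN"]
    access_token = os.environ["ACCESS_TOKEN"]
    job_id = os.environ["JOB_ID"]

    # Poll until the job reports "completed" and returns a download URL.
    url = None
    while url is None:
        url = extract_download_url(fetch_job(domain, access_token, job_id))
        if url is None:
            time.sleep(2)

    # Download the gzipped CSV and extract it.
    with urllib.request.urlopen(url) as response:
        with open("auth0-users.csv.gz", "wb") as f:
            shutil.copyfileobj(response, f)
    with gzip.open("auth0-users.csv.gz", "rb") as src:
        with open("auth0-users.csv", "wb") as dst:
            shutil.copyfileobj(src, dst)
    print("Saved auth0-users.csv")
```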

Export users using Auth0 Admin UI

  1. In the Auth0 Dashboard, navigate to Extensions.

  2. If you haven't installed it already, locate and install the User Import / Export extension.

    The User Import / Export extension found under Extensions in Auth0 Admin UI

  3. Under Installed Extensions, open the User Import / Export extension.

  4. If prompted, accept the authorization request from the extension to access your Auth0 account.

  5. Click Export.

  6. Optionally, specify the fields to export and the corresponding column names in the exported data. By default, all fields available for download will be included.

  7. Select the CSV option in the Export Format dropdown.

  8. Optionally, select your user database in the Connection dropdown.

  9. Click Export Users to download the user data as a CSV file.


Request password hashes

By default, password hashes are not included in the user data exported from an Auth0-hosted database. To add password hashes to the user data imported by AuthN, you will need to open a ticket with Auth0 support. Note that requesting this service is not currently available for the Auth0 free subscription tier. Refer to Auth0 documentation for details.

Import users from Auth0

note

As of May 2024, user data exported from Auth0 in CSV format contains a leading single quote character. This character might invalidate some data, such as email addresses, which are required by AuthN. Therefore, it needs to be removed before import.
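The leading quote can be stripped with a short script before import. A minimal sketch, where the input and output file names are placeholders for your own export:

```python
import csv


def strip_leading_quotes(input_path, output_path):
    """Remove a leading single quote from every field in a CSV export."""
    with open(input_path, newline="") as src:
        with open(output_path, "w", newline="") as dst:
            writer = csv.writer(dst)
            for row in csv.reader(src):
                # Drop only a leading apostrophe; other content is kept as-is.
                writer.writerow([field.removeprefix("'") for field in row])


if __name__ == "__main__":
    import os

    # Placeholder file names; point these at your Auth0 export.
    if os.path.exists("auth0-users.csv"):
        strip_leading_quotes("auth0-users.csv", "auth0-users-clean.csv")
```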

  1. Open the AuthN Users page in the Pangea User Console.

  2. Click + New.

  3. Click Import Users. This will open an Import users dialog.

  4. Click Import CSV. This will open a file selector.

  5. Navigate to the CSV file containing your user data, select it, and submit the file selection. This will open a User profile mapping dialog.

  6. In the dialog, map the CSV fields to the AuthN user fields:

    • Map the Required email field to the corresponding field in your user data with valid email values.

    • If desired, you can also map the Optional fields.

      If your user data contains password hashes, you can map them to AuthN's password_hash field. Importing password hashes allows your users to sign in with their existing passwords and complete the enrollment process during their first login.

      • Map the passwordHash field in the Auth0 data to the password_hash field in AuthN. This will display a Password Hash section at the bottom of the User profile mapping dialog.
      • In the Password Hash section, for Hash Algorithm, choose BCRYPT.

      During their first authentication, imported users will be required to complete their enrollment process.

  7. Click Import. This will take you back to the Import Users dialog with the list of recent imports.

  8. Click Done.

  9. Refresh the AuthN Users page in the Pangea User Console to view the newly imported user records.

If you encounter errors during the import process, ensure that your user data is correctly mapped to the required AuthN fields and that it contains valid values.

You can import a CSV file multiple times without overwriting existing users with the same email addresses. Only users with unique email addresses are processed and added to AuthN.

Example script for converting JSON to CSV

The following script uses the JSON keys from the first user record as the column headers in the resulting CSV file.

JSON-to-CSV
import csv
import json


def convert_json_to_csv(json_file_path, csv_file_path):
    # Open the JSON file and load the data
    with open(json_file_path, 'r') as file:
        data = json.load(file)

    users = data["users"]
    print(f"Found {len(users)} user records.")

    # Open a CSV file for writing
    with open(csv_file_path, 'w', newline='') as file:
        csv_writer = csv.writer(file)

        # Write the header row based on the keys of the first record
        if users:
            csv_writer.writerow(users[0].keys())

            # Write one row of values per user record
            for item in users:
                csv_writer.writerow(item.values())

    print(f"CSV file has been created successfully at {csv_file_path}.")


# Get file paths from the user
input_json_file = input("Enter the path to your input JSON file: ")
output_csv_file = input("Enter the path for the output CSV file: ")

# Call the function to convert JSON to CSV
convert_json_to_csv(input_json_file, output_csv_file)
