List files from an Azure Data Lake Storage directory.

```yaml
type: "io.kestra.plugin.azure.storage.adls.List"
```

List all files and directories in a specific Azure Data Lake Storage directory and log each file's metadata.

```yaml
id: azure_data_lake_storage_list
namespace: company.team

tasks:
  - id: list_files_in_dir
    type: io.kestra.plugin.azure.storage.adls.List
    connectionString: "{{ secret('AZURE_CONNECTION_STRING') }}"
    fileSystem: "tasks"
    endpoint: "https://yourblob.blob.core.windows.net"
    directoryPath: "path/to/my/directory/"

  - id: for_each_file
    type: io.kestra.plugin.core.flow.EachParallel
    value: "{{ outputs.list_files_in_dir.files }}"
    tasks:
      - id: log_file_name
        type: io.kestra.plugin.core.debug.Echo
        level: DEBUG
        format: "{{ taskrun.value }}"
```
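
In recent Kestra releases, the EachParallel and Echo tasks used above are deprecated in favor of the core ForEach and Log tasks. A minimal sketch of the same loop with those replacements, assuming your Kestra version ships them (concurrencyLimit: 0 removes the limit so iterations run in parallel):

```yaml
  - id: for_each_file
    type: io.kestra.plugin.core.flow.ForEach
    values: "{{ outputs.list_files_in_dir.files }}"
    concurrencyLimit: 0 # no limit, so iterations run in parallel
    tasks:
      - id: log_file_name
        type: io.kestra.plugin.core.log.Log
        level: DEBUG
        message: "{{ taskrun.value }}"
```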
Properties

directoryPath
Full path to the directory.

endpoint
The blob service endpoint.

fileSystem
The name of the file system. If the path name contains special characters, pass in the URL-encoded version of the path name.

connectionString
Connection string of the Storage Account.

sasToken
The SAS token to use for authenticating requests. This string should only be the query parameters (with or without a leading '?') and not a full URL.

sharedKeyAccountAccessKey
Shared Key access key for authenticating requests.

sharedKeyAccountName
Shared Key account name for authenticating requests.

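As a sketch of the Shared Key and SAS token options listed above (the secret names are placeholders), the same task could be configured without a connection string:

```yaml
  - id: list_files_in_dir
    type: io.kestra.plugin.azure.storage.adls.List
    endpoint: "https://yourblob.blob.core.windows.net"
    fileSystem: "tasks"
    directoryPath: "path/to/my/directory/"
    # Shared Key authentication instead of connectionString
    sharedKeyAccountName: "{{ secret('AZURE_ACCOUNT_NAME') }}"
    sharedKeyAccountAccessKey: "{{ secret('AZURE_ACCESS_KEY') }}"
    # Or use a SAS token (query parameters only, not a full URL):
    # sasToken: "{{ secret('AZURE_SAS_TOKEN') }}"
```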
Outputs

files
The list of files. Each entry exposes the file metadata returned by the storage service as strings, date-time fields, and a URI, plus lease information: duration (INFINITE or FIXED), state (AVAILABLE, LEASED, EXPIRED, BREAKING, or BROKEN), and status (LOCKED or UNLOCKED).