Run Azure Functions from Azure Data Factory pipelines



Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. Using Azure Functions, you can run a script or piece of code in response to a variety of events. Azure Data Factory (ADF) is a managed data integration service in Azure that enables you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. Azure Functions is now integrated with ADF, enabling you to run an Azure function as a step in your data factory pipelines.
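In a pipeline definition, the new Azure Function activity is declared like any other activity. As a rough sketch (the linked service, function, and body names below are placeholders), it might look like:

```json
{
    "name": "CallMyFunction",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "MyAzureFunctionLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": "MyHttpTriggeredFunction",
        "method": "POST",
        "body": { "input": "some value" }
    }
}
```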



The Discussion

  • Leo Scott

    The link doesn't work, i.e. https://azfr/513/01 does not work.

  • @Leo Scott: Thanks Leo - all fixed. Error between keyboard and chair.

  • Can we run Python Azure Functions in Azure Data Factory? My current function (written in Python) does not return any value; it saves its output to the data lake as a CSV file. When I tried to run the function from Data Factory I got the following error: { "errorCode": "3600", "message": "Error calling the endpoint.", "failureType": "UserError", "target": "Azure Functiontest" }.

  • Same error with a JavaScript function using a function key.

  • Same for us with a C# function. Interestingly, the function actually executes correctly, but ADF still reports an error. That is a real problem in a production environment, where we need to determine whether it really worked or not.

  • Having exactly the same issue. Even after changing the return value to a JObject type, it still shows an incorrect Failed message.

  • Hey guys,

    Sorry to hear that you are facing issues. I will need more information about your Azure Function Apps to investigate and see how I can help you with this. Please reach out to me at and we can look into this further.

  • To make our .NET code in Azure Functions work with Azure Data Factory, we needed to change the return type of the function. Here is my example for that:
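The commenter's .NET example did not survive; as an illustration of the underlying requirement, here is a minimal sketch in Python instead. The Azure Function activity expects the function to return a JSON object (not a bare string or an empty body); otherwise the activity fails with errorCode 3600 even when the function itself ran fine. Names and values below are illustrative:

```python
import json

def build_adf_response(rows_written: int) -> str:
    """Build the HTTP response body for an ADF Azure Function activity.

    ADF parses the response as JSON, so the function must return a JSON
    object; returning nothing (as in the CSV-writing function above)
    triggers the "Error calling the endpoint" / 3600 failure.
    """
    payload = {"status": "succeeded", "rowsWritten": rows_written}
    return json.dumps(payload)

body = build_adf_response(42)
print(body)  # {"status": "succeeded", "rowsWritten": 42}
```

The same idea applies in .NET (return a JObject or serialized object rather than a plain string) and in JavaScript (return an object in the response body).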
  • Hi guys, in which part of Data Factory can you extend the response timeout? My AzureFunctionActivity runs for about 3.8 minutes (230 seconds) and fails with "Error calling the endpoint", while other, nearly identical functions that run for less time complete perfectly.

  • Jeremy Winchell

    Will we be able to use an Azure Function as a dataset for a Copy activity? I have Azure Functions that retrieve data from a more complex API where the HTTP connector will not work.


  • If you see 'Server Error' or '500 - The request timed out.', this probably means that your Azure Function is taking more than 230 seconds to run. This is a non-configurable timeout on the Azure Function/App Service side.

    The workaround for this is to use Durable Functions; here is a useful link: The Durable Function will return a 'statusQueryGetUri' that you can then call with a Web activity to monitor the status and get the eventual output when the function completes execution.
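The polling pattern described above can be sketched as follows. This is a simulation of what an ADF Until/Wait loop of Web activities does against 'statusQueryGetUri', not production code; `fetch_status` stands in for a hypothetical HTTP GET of that URI:

```python
import time

def poll_durable_status(fetch_status, interval_s=1.0, max_polls=60):
    """Poll a Durable Functions status endpoint until the orchestration
    finishes. `fetch_status` is any callable returning the status JSON
    as a dict (e.g. a wrapper around GET statusQueryGetUri)."""
    for _ in range(max_polls):
        status = fetch_status()
        if status["runtimeStatus"] in ("Completed", "Failed", "Terminated"):
            return status
        time.sleep(interval_s)
    raise TimeoutError("orchestration did not finish in time")

# Simulated status endpoint: 'Running' twice, then 'Completed' with output.
_responses = iter([
    {"runtimeStatus": "Running"},
    {"runtimeStatus": "Running"},
    {"runtimeStatus": "Completed", "output": {"rowsWritten": 10000}},
])
result = poll_durable_status(lambda: next(_responses), interval_s=0.0)
print(result["output"])  # {'rowsWritten': 10000}
```

Because the starter function returns immediately with the status URI, the 230-second limit only applies to each individual poll, not to the long-running orchestration itself.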

  • @Matias: The above comment should be helpful for your case.
    @Jeremy Winchell: This is currently not possible. Could you elaborate more on your use case? You can reach me at

    We have also improved the error messages surfaced from Azure Functions; they should help guide you in getting your functions working properly :)

  • @Jeremy Winchell: How did you solve this in the end? I have the following use case: I need to extract 10,000 records from the Google Audit API. I cannot use the REST, OData, or Web connectors in ADF as they do not support OAuth2. I therefore have to use an Azure Function and the Google SDK to talk to the API. This returns 10,000 records which I need to write to my Data Lake Gen2. There is no Data Lake Gen2 SDK, so I would like to return them to ADF so that ADF can use its scale and connectors to write to the lake. It looks like the function cannot return the 10,000 records to ADF.

    Another option might be using Python with the Google SDK on Databricks and letting Databricks write to the Data Lake.
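One workaround for result sets too large to return in a single function response (an assumption on my part, not an official pattern) is to page the records and have an ADF ForEach loop call the function once per page index, keeping each response small:

```python
def chunk(records, page_size):
    """Split a large result set into pages small enough for one
    Azure Function activity response each. An ADF ForEach over the
    page indexes could then call the function once per page."""
    for start in range(0, len(records), page_size):
        yield records[start:start + page_size]

# Hypothetical 10,000-record result set, paged 1,000 records at a time.
records = list(range(10_000))
pages = list(chunk(records, 1_000))
print(len(pages))     # 10
print(len(pages[0]))  # 1000
```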
