{:new_window: target="_blank"}
{:shortdesc: .shortdesc}
{:screen: .screen}
{:codeblock: .codeblock}

# Task title with gerund
{: #servicename_task}

*Last updated: nn month yyyy*
{: .last-updated}

Before you can start processing your {{site.data.keyword.Bluemix_notm}} application data with {{site.data.keyword.hadoop}}, you must first upload it to the {{site.data.keyword.hadoopst}} Hadoop Distributed File System (HDFS) file structure.
{:shortdesc}

Complete the following tasks to upload your data to the HDFS environment with the webHDFS REST API. A sketch of the upload call follows the steps below.

## Subtask title with gerund
{: #servicename_subtask}

To access the HDFS file system, you must connect as the `biblumix` user so that you can access the `/user/biblumix` directory in HDFS. To find the `biblumix` user password, follow these steps:

1. Step 1

   **Tip:** blah blah
2. Step 2. For example input:

   ```
   copyable code
   ```
   {: codeblock}
3. Step 3. For example output:

   ```
   displayed info
   ```
   {: screen}
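
The following is a minimal sketch of one way to push a local file to the `/user/biblumix` directory through the webHDFS REST API, assuming a Python client with the `requests` library. The webHDFS base URL, password, and file names are placeholders that you must replace with the values for your own cluster; they are not taken from this documentation.

```
# Minimal sketch: upload a local file to HDFS through the webHDFS REST API.
# WEBHDFS_URL, PASSWORD, LOCAL_FILE, and HDFS_PATH are placeholder values.
import requests

WEBHDFS_URL = "https://<webhdfs-host>:<port>/webhdfs/v1"  # replace with your cluster's webHDFS endpoint
USER = "biblumix"                                         # HDFS user described in the steps above
PASSWORD = "<password>"                                   # password that you retrieved in the steps above
LOCAL_FILE = "sample.csv"                                 # example local file to upload
HDFS_PATH = "/user/biblumix/sample.csv"                   # destination path in HDFS

# Step 1: ask webHDFS to create the file. Per the webHDFS protocol, the
# server replies with a 307 redirect whose Location header identifies the
# node that accepts the file contents. Some gateways proxy this step and
# return no Location header, so fall back to the original URL.
create_url = WEBHDFS_URL + HDFS_PATH + "?op=CREATE&overwrite=true"
resp = requests.put(create_url, auth=(USER, PASSWORD), allow_redirects=False)
resp.raise_for_status()
upload_url = resp.headers.get("Location", create_url)

# Step 2: send the file contents to the upload location.
with open(LOCAL_FILE, "rb") as data:
    resp = requests.put(upload_url, auth=(USER, PASSWORD), data=data)
resp.raise_for_status()
print("Uploaded " + LOCAL_FILE + " to " + HDFS_PATH)
```
{: codeblock}

The two PUT requests mirror the standard webHDFS `CREATE` flow: the first request negotiates where the data goes, and the second sends the bytes. If your cluster uses a self-signed certificate, you may need to pass a CA bundle through the `verify` parameter of `requests.put`.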