Project Management Commands:

CreateProject(name="...", desc="...", templateUri="...", driver="Pg|mysql", authorizationToken="...");
- create a new project on the server
- name - name of the new project
- authorizationToken - project creation authorization token (can be passed via the -a command-line parameter)
- desc - (optional) project description
- templateUri - (optional) project template to create the project from
- driver - (optional) underlying DB backend: Pg | mysql

DeleteProject(id="...");
- drop the project on the server
- id - (optional) project id; if not specified, the command tries to drop the current project

OpenProject(id="...");
- open an existing project for data modeling and data upload
- id - identifier of an existing project (takes the form of an MD5 hash)

RememberProject(fileName="...");
- saves the current project identifier into the specified file
- fileName - file to save the project identifier to

UseProject(fileName="...");
- loads the current project identifier from the specified file
- fileName - file to load the project identifier from

InviteUser(email="...", msg="...", role="...");
- invites a new user to the project (must call CreateProject or OpenProject before)
- email - the invited user's e-mail
- msg - (optional) invitation message
- role - (optional) initial user's role: admin|adminRole|editor|editorRole|dashboard only|dashboardOnlyRole|readonly|readonlyUserRole
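For illustration, a minimal project bootstrap script could look like the following sketch; the project name, authorization token, file name and e-mail address are placeholders, and the role is one of the values listed above:

  CreateProject(name="Quotes Demo", authorizationToken="<your token>");
  RememberProject(fileName="quotes.pid");
  InviteUser(email="colleague@example.com", msg="Please join the Quotes Demo project.", role="editor");

A later script can then reconnect to the same project with UseProject(fileName="quotes.pid");.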
CreateUser(domain="...", username="...", password="...", firstName="...", lastName="...", company="...", phone="...", country="...", position="...", ssoProvider="...", usersFile="...", append="...");
- creates a new user
- domain - the GoodData users domain. The domain needs to be created by GoodData admins and associated with your GoodData account
- username - the new user's username
- password - the new user's password
- firstName - the new user's first name
- lastName - the new user's last name
- company - (optional) the new user's company name
- phone - (optional) the new user's phone
- country - (optional) the new user's country (e.g. 'cz' or 'us;ca')
- position - (optional) the new user's position
- ssoProvider - (optional) the new user's SSO provider (e.g. SALESFORCE)
- usersFile - (optional) writes the new user's URI to the specified file
- append - (optional) should the users file be appended to (default is false)

GetProjectUsers(usersFile="...", field="...", activeOnly="...");
- get the list of users from the current project
- usersFile - file where the users' URIs or e-mails are written
- field - uri | email - writes either the user URI or the e-mail to the usersFile
- activeOnly - (optional) lists only active (not disabled) users; the default is false

AddUsersToProject(usersFile="...", role="...");
- adds the users in the usersFile to the open project in a specific role
- usersFile - the list of user URIs in a file
- role - (optional) initial user's role: admin|adminRole|editor|editorRole|dashboard only|dashboardOnlyRole|readonly|readonlyUserRole

DisableUsersInProject(usersFile="...");
- disables the users in the usersFile in the open project
- usersFile - the list of user URIs in a file

ExportProject(tokenFile="...", exportUsers="...", exportData="...", authorizedUsers="...");
- exports an existing project to temporary storage and returns the import token
- tokenFile - a file where the import token will be stored
- exportUsers - export the existing project's users (true | false)
- exportData - export the existing project's data (true | false)
- authorizedUsers - (optional) comma-separated list of valid GoodData users who can import the project

ImportProject(tokenFile="...");
- imports previously exported project content, identified by the import token, into a new empty project
- tokenFile - a file with the import token (see ExportProject)

Metadata Management Commands:

Important: All the commands in this section expect to know what project to work in already. You must call CreateProject, OpenProject or RetrieveProject somewhere in your script before these commands.

ExportMetadataObjects(tokenFile="...", objectIDs="...");
- exports metadata objects with all dependencies; the attributes and facts must be created via ExecuteMaql
- tokenFile - a file where the import token will be stored
- objectIDs - the comma-separated list of metadata object IDs

ImportMetadataObjects(tokenFile="...", overwrite="...", updateLDM="...");
- imports metadata objects from the token; the attributes and facts must be created via ExecuteMaql
- tokenFile - a file with a valid import token
- updateLDM - if true, the attributes' and facts' names and descriptions are updated

RetrieveMetadataObject(id="...", file="...");
- retrieves a metadata object and stores it in a file
- id - valid object id (integer number)
- file - file where the object content (JSON) is going to be stored

StoreMetadataObject(file="...", id="...");
- stores a metadata object whose content (JSON) is read from a file to the metadata server
- file - file where the object content (JSON) is stored
- id - (optional) valid object id (integer number); if the id is specified, the existing object is modified, otherwise a new object is created

DropMetadataObject(id="...");
- drops the object with the specified id from the project's metadata
- id - valid object id (integer number)
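As a sketch of how the export/import pair above can copy objects between projects (the project ids are placeholders and the object IDs are arbitrary examples; remember that the attributes and facts the objects depend on must already exist in the target project):

  OpenProject(id="<source project id>");
  ExportMetadataObjects(tokenFile="objects.token", objectIDs="1024,1025");
  OpenProject(id="<target project id>");
  ImportMetadataObjects(tokenFile="objects.token", overwrite="false", updateLDM="false");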
Lock(path="...");
- prevents concurrent runs of multiple instances sharing the same lock file. Lock files older than 1 hour are discarded.
- path - path to a lock file

MigrateDatasets(configFiles="...");
- migrates the project's datasets from CL 1.1.x to CL 1.2.x
- configFiles - the comma-separated list of ALL of the project's dataset XML configuration files

GenerateManifests(configFiles="...", dir="...");
- generates the SLI manifests for the specified XML config files
- configFiles - the comma-separated list of the project's dataset XML configuration files
- dir - the target directory where the JSON SLI manifests are going to be stored

Logical Model Management Commands:

Important: All the commands in this section expect to know what project to work in and how your data model looks. You must call one of the CreateProject, OpenProject or RetrieveProject commands and a Use* connector command somewhere in your script before these commands.

GenerateMaql(maqlFile="...");
- generate a MAQL DDL script describing the data model from the local config file
- maqlFile - path to the MAQL file (will be overwritten)

GenerateUpdateMaql(maqlFile="...");
- generate a MAQL DDL alter script that creates the columns available in the local configuration but missing in the remote GoodData project
- maqlFile - path to the MAQL file (will be overwritten)

ExecuteMaql(maqlFile="...", ifExists="...");
- run the MAQL DDL script on the server to generate the data model
- maqlFile - path to the MAQL file (relative to PWD)
- ifExists - (optional) if set to true, the command quits silently if the maqlFile does not exist (true | false, default is false)

Data Commands:

Important: All the commands in this section expect to know what project to work in and know the data model and data. You must call one of the CreateProject, OpenProject or RetrieveProject commands and a Use* connector command somewhere in your script before these commands.

TransferData(incremental="...", waitForFinish="...");
- upload data to the GoodData server
- incremental - (optional) when true, will try to append (or merge/replace via a matching CONNECTION_POINT) the data (true | false, default is false)
- waitForFinish - (optional) the process waits for the server-side processing to finish (true | false, default is true)

Dump(csvFile="...");
- dumps the connector data to a local CSV file
- csvFile - path to the CSV file

ExecuteDml(maql="...");
- executes a MAQL DML command (e.g. DELETE)
- maql - the MAQL DML command
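To show how the model and data commands combine in practice, here is a minimal end-to-end sketch that loads a CSV file with the UseCsv command described in the CSV Connector section below; the file names are placeholders and the script assumes the project identifier was previously saved with RememberProject:

  UseProject(fileName="quotes.pid");
  UseCsv(csvDataFile="data/quotes.csv", configFile="quotes.config.xml", hasHeader="true");
  GenerateMaql(maqlFile="quotes.maql");
  ExecuteMaql(maqlFile="quotes.maql");
  TransferData();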
Data Connectors:

The following commands provide data connectors for consuming data from flat files (CSV Connector), databases (JDBC Connector) and even SaaS apps (Salesforce, Google Analytics, PivotalTracker Connectors). All connectors provide commands like GenerateConfig, Use, and ExportToCsv. Data input should be encoded in UTF-8.

CSV Connector Commands:

GenerateCsvConfig(csvHeaderFile="...", configFile="...", defaultLdmType="...", facts="...", folder="...", separator="...");
- generate a sample XML config file based on the fields from your CSV file. If the config file already exists, only new columns are added. The config file must be edited, as the LDM types (attribute | fact | label etc.) are assigned randomly.
- csvHeaderFile - path to the CSV file (only the first header row will be used)
- configFile - path to the configuration file (will be overwritten)
- defaultLdmType - (optional) LDM mode to be associated with new columns (only the ATTRIBUTE mode is supported by the ProcessNewColumns task at this time)
- facts - (optional) comma-separated list of fields known to be facts
- folder - (optional) folder where to place new attributes
- separator - (optional) field separator, the default is ','

UseCsv(csvDataFile="...", configFile="...", hasHeader="...", separator="...");
- load a CSV data file using the config file describing the file structure, must call CreateProject or OpenProject before
- csvDataFile - path to the CSV data file
- configFile - path to the XML configuration file (see the GenerateCsvConfig command that generates the config file template)
- hasHeader - (optional) true if the CSV file has a header row (true | false, default is true)
- separator - (optional) field separator, the default is ','. Use '\t' or type the tab character for tabulator.

GoogleAnalytics Connector Commands:

GenerateGoogleAnalyticsConfig(name="...", configFile="...", dimensions="...", metrics="...");
- generate an XML config file based on the fields from your GA query
- name - the new dataset name
- configFile - path to the configuration file (will be overwritten)
- dimensions - pipe (|) separated list of Google Analytics dimensions (see http://code.google.com/apis/analytics/docs/gdata/gdataReferenceDimensionsMetrics.html)
- metrics - pipe (|) separated list of Google Analytics metrics (see http://code.google.com/apis/analytics/docs/gdata/gdataReferenceDimensionsMetrics.html)

UseGoogleAnalytics(configFile="...", token="...", username="...", password="...", profileId="...", dimensions="...", metrics="...", startDate="...", endDate="...", filters="...");
- load GA data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the XML configuration file (see the GenerateGoogleAnalyticsConfig command that generates the config file template)
- token - Google Analytics AuthSub token (you must specify either the token or username/password)
- username - Google Analytics username (you must specify either the token or username/password)
- password - Google Analytics password (you must specify either the token or username/password)
- profileId - Google Analytics profile ID (this is the value of the id query parameter in the GA URL)
- dimensions - pipe (|) separated list of Google Analytics dimensions (see http://code.google.com/apis/analytics/docs/gdata/gdataReferenceDimensionsMetrics.html)
- metrics - pipe (|) separated list of Google Analytics metrics (see http://code.google.com/apis/analytics/docs/gdata/gdataReferenceDimensionsMetrics.html)
- startDate - the GA start date in the yyyy-mm-dd format
- endDate - the GA end date in the yyyy-mm-dd format
- filters - the GA filters (see http://code.google.com/apis/analytics/docs/gdata/gdataReferenceDataFeed.html#filters)
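A hypothetical Google Analytics load might look like the sketch below; the credentials and profile ID are placeholders, and the dimensions and metrics are simply common GA fields chosen for illustration:

  GenerateGoogleAnalyticsConfig(name="ga_visits", configFile="ga.config.xml", dimensions="ga:date|ga:browser", metrics="ga:visits|ga:pageviews");
  UseGoogleAnalytics(configFile="ga.config.xml", username="me@example.com", password="<password>", profileId="ga:12345", dimensions="ga:date|ga:browser", metrics="ga:visits|ga:pageviews", startDate="2011-01-01", endDate="2011-03-31");
  GenerateMaql(maqlFile="ga.maql");
  ExecuteMaql(maqlFile="ga.maql");
  TransferData();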
JDBC Connector Commands:

GenerateJdbcConfig(name="...", configFile="...", driver="...", url="...", query="...", username="...", password="...");
- generate an XML config file based on the fields from your JDBC query
- name - the new dataset name
- configFile - path to the configuration file (will be overwritten)
- driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver in the lib subdirectory
- url - JDBC URL (e.g. "jdbc:derby:mydb")
- query - SQL query (e.g. "SELECT employee,dept,salary FROM payroll")
- username - (optional) JDBC username
- password - (optional) JDBC password

UseJdbc(configFile="...", driver="...", url="...", query="...", queryFile="...", username="...", password="...");
- load JDBC data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the XML configuration file (see the GenerateJdbcConfig command that generates the config file template)
- driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver in the lib subdirectory
- url - JDBC URL (e.g. "jdbc:derby:mydb")
- query - SQL query (e.g. "SELECT employee,dept,salary FROM payroll")
- queryFile - a file that contains the SQL query
- username - (optional) JDBC username
- password - (optional) JDBC password

ExportJdbcToCsv(dir="...", driver="...", url="...", username="...", password="...");
- exports all tables from the database to CSV files
- dir - target directory
- driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver in the lib subdirectory
- url - JDBC URL (e.g. "jdbc:derby:mydb")
- username - (optional) JDBC username
- password - (optional) JDBC password

SalesForce Connector Commands:

GenerateSfdcConfig(name="...", configFile="...", query="...", username="...", password="...", token="...", partnerId="...");
- generate an XML config file based on the fields from your SFDC query
- name - the new dataset name
- configFile - path to the configuration file
- query - SOQL query (e.g. "SELECT Id, Name FROM Account"), see http://www.salesforce.com/us/developer/docs/api/Content/data_model.htm
- username - SFDC username
- password - SFDC password
- token - SFDC security token (you may append the security token to the password instead of using this parameter)
- partnerId - SFDC client ID (partner token) that allows extended access to the SalesForce API

UseSfdc(configFile="...", query="...", username="...", password="...", token="...", partnerId="...");
- load SalesForce data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the XML configuration file (see the GenerateSfdcConfig command that generates the config file template)
- query - SOQL query (e.g. "SELECT Id, Name FROM Account"), see http://www.salesforce.com/us/developer/docs/api/Content/data_model.htm
- username - SFDC username
- password - SFDC password
- token - SFDC security token (you may append the security token to the password instead of using this parameter)
- partnerId - SFDC client ID (partner token) that allows extended access to the SalesForce API
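A sketch of a SalesForce load using the two commands above; the credentials, security token, dataset name and file names are placeholders:

  GenerateSfdcConfig(name="sfdc_accounts", configFile="sfdc.config.xml", query="SELECT Id, Name FROM Account", username="me@example.com", password="<password>", token="<security token>");
  UseSfdc(configFile="sfdc.config.xml", query="SELECT Id, Name FROM Account", username="me@example.com", password="<password>", token="<security token>");
  GenerateMaql(maqlFile="sfdc.maql");
  ExecuteMaql(maqlFile="sfdc.maql");
  TransferData();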
MS CRM 2011 Online Connector Commands:

UseMsCrm(configFile="...", username="...", password="...", host="...", org="...", entity="...", fields="...");
- load MS CRM 2011 data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the configuration file (will be overwritten)
- username - MS CRM username
- password - MS CRM password
- host - MS CRM server hostname
- org - MS CRM organization name
- entity - MS CRM entity name (e.g. account, opportunity etc.)
- fields - MS CRM entity fields (e.g. accountid, name etc.)

Sugar CRM Connector Commands:

UseSugarCrm(configFile="...", username="...", password="...", host="...", entity="...", fields="...");
- load Sugar CRM data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the configuration file (will be overwritten)
- username - Sugar CRM username
- password - Sugar CRM password
- host - Sugar CRM server hostname
- entity - Sugar CRM entity name (e.g. account, opportunity etc.)
- fields - Sugar CRM entity fields (e.g. id, name etc.)

Chargify Commands:

UseChargify(configFile="...", apiKey="...", domain="...", entity="...", fields="...");
- load Chargify data using the config file describing the data structure, must call CreateProject or OpenProject before
- configFile - path to the configuration file (will be overwritten)
- apiKey - Chargify API key
- domain - Chargify domain
- entity - Chargify entity name (e.g. products, subscriptions etc.)
- fields - Chargify entity fields (e.g. id, name etc.)

Pivotal Tracker Connector Commands:

UsePivotalTracker(username="...", password="...", pivotalProjectId="...", storyConfigFile="...", labelConfigFile="...", labelToStoryConfigFile="...", snapshotConfigFile="...");
- downloads and transforms the PT data
- username - PT username
- password - PT password
- pivotalProjectId - PT project ID (integer)
- storyConfigFile - PT stories XML schema config
- labelConfigFile - PT labels XML schema config
- labelToStoryConfigFile - PT labels-to-stories XML schema config
- snapshotConfigFile - PT snapshots XML schema config

Time Dimension Connector Commands:

UseDateDimension(name="...", includeTime="...", type="...");
- load a new time dimension into the project, must call CreateProject or OpenProject before
- name - the time dimension name that differentiates this time dimension from others. This is typically something like "closed", "created" etc.
- includeTime - generate the time dimension (true | false)
- type - specifies the name of a particular fiscal date dimension
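Finally, a sketch of adding a date dimension to an open project; the dimension name reuses the "created" example from the parameter description above, and the file names are placeholders:

  OpenProject(id="<project id>");
  UseDateDimension(name="created", includeTime="false");
  GenerateUpdateMaql(maqlFile="created_date.maql");
  ExecuteMaql(maqlFile="created_date.maql");

When includeTime is true, a subsequent TransferData() call is typically needed as well to populate the generated time data.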