
Analyze data

Run an analysis workflow

Use the following steps to run an analysis workflow:

  1. Create a new analysis table either from the project page or from the Analyses panel on the dataset page.

  2. Select the analysis of your choice from the popup that lists all analyses offered in IDEAS. Workflows are organized into toolkits, so you can filter the list by toolkit. Click Create Table to create the analysis table and navigate to it.

  3. Click Add Row (or the corresponding icon) to start adding data. Once you add a row, columns are auto-populated with default values. Double-click the cell under Input Movie Files to open a panel where you can select a file. You can select any compatible file from any dataset or previously created analysis table in the current project. You can delete a row by clicking the corresponding icon.

    Complete the table by modifying parameters as desired (press Enter to save each modification).

    Note the following:

    • Any cells highlighted in red must be filled in prior to continuing.
    • You can quickly navigate to a parameter using the Jump to column popup menu as shown below.

    • Some cells may show an icon when you hover over them, indicating that you can fill the cell with information from the metadata.

  4. Configure your computing resources by navigating to the Compute Resources column and selecting the resources you want to use for the task. This lets you tailor the resources to smaller or larger input files. Consider the following two examples (a rough cost calculation is sketched after these steps):

    • Example 1: If you want to run CNMFe on a small (2 GB) input movie, you can decrease the compute credit expenditure by reducing the default requirements (8 vCPUs, 64 GB RAM, 2 credits/hour) to (4 vCPUs, 8 GB RAM, 0.5 credits/hour).
    • Example 2: If you want to run CNMFe on a large (100 GB) input movie, you can increase the compute requirements to accommodate (up to 256 GB of RAM is offered).

  5. Duplicate a row by clicking the Copy row icon. This is useful if you want to run the same workflow with a range of different parameters to find the settings that work best for your data.

  6. Optionally, set a Task ID by selecting which of the available parameters to use as an identifier. By default, the time the task was started is used.

  7. Click Start All Tasks (or the corresponding icon) to begin the analysis. The number next to the button indicates how many tasks will be executed. Once a task is initiated, the following statuses appear in the last column (in order of task progression):

    • Queued
    • Running
    • Complete

    You can click on the status (colored rectangle) to view the task log.

    Error Status

    If an error is encountered, the status will change to Failed.

  8. Click the icon to download the analysis table as a .csv file.
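To put the compute-resource examples from step 4 in concrete terms, the sketch below estimates credit expenditure as the hourly rate multiplied by task duration. The credit rates come from the examples above; the runtimes are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope credit estimate: credits consumed = credits/hour x hours.
# Rates are taken from the CNMFe examples above; the runtimes are assumed.

def credit_cost(credits_per_hour: float, runtime_hours: float) -> float:
    """Estimate compute credits consumed by a single task."""
    return credits_per_hour * runtime_hours

# Default resources for a small movie: 8 vCPUs, 64 GB RAM at 2 credits/hour.
default_cost = credit_cost(credits_per_hour=2.0, runtime_hours=1.5)   # 3.0 credits

# Reduced resources: 4 vCPUs, 8 GB RAM at 0.5 credits/hour (assumed similar runtime).
reduced_cost = credit_cost(credits_per_hour=0.5, runtime_hours=1.5)   # 0.75 credits

print(f"default: {default_cost:.2f} credits, reduced: {reduced_cost:.2f} credits")
```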

Task info and results

Information related to an analysis task can be found in the analysis table by clicking the Go to Task Info link.

You can view all tasks for the project by clicking the View All Tasks link on the project page.

The Tasks table lists all tasks executed or in progress, including the date the task was run, the user who executed it, its status, run-time duration, and compute credits consumed. You can click the source link to navigate directly to the corresponding analysis table. The bottom of the table shows the total compute credits consumed across all tasks.
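If you export a table as a .csv (step 8 above), you can also tally credits offline. The sketch below assumes the export includes per-task status and credit columns; the file name and column names used here are hypothetical and may differ from the actual export.

```python
# A minimal sketch: total compute credits from an exported .csv.
# The file name and the "Status" / "Compute Credits" column names are
# assumptions; adjust them to match the columns in your export.
import pandas as pd

tasks = pd.read_csv("analysis_table.csv")

total_credits = tasks["Compute Credits"].sum()
credits_by_status = tasks.groupby("Status")["Compute Credits"].sum()

print(f"Total credits consumed: {total_credits:.2f}")
print(credits_by_status)
```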

Data assignment

By default, the results of the analysis are attached to the corresponding recording. When you insert analysis results into a dataset table, if the Attach results toggle is enabled, the results appear in the table and are automatically associated with the correct recording. You can control whether results are visible in the dataset table by toggling the Attach results setting on or off, or by clicking the corresponding icons.

For example, suppose you ran CNMFe for TEST-1 twice (each time with different parameters), but only one of the results was good. You can remove the 'bad' results from the dataset table by turning off the corresponding Attach results toggle.


You can see what dataset and recording the results are attached to by hovering over the toggle button.

Inspect task logs

If you click the task Log icon in the analysis table, a window opens where you can view the task logs and resource usage. The logs are especially useful when an error is encountered while running a workflow.

Click the Resource Usage tab to view a table tracking the compute resources consumed. It shows the minimum and maximum usage for CPU, RAM, and storage, as well as the percentage of the allocation that was used. Use this table to guide your choice of resources for future tasks.
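As a rough guide to right-sizing, you can compare the peak usage reported in the Resource Usage tab against what was allocated. The sketch below uses placeholder numbers; substitute the values from your own task.

```python
# A minimal sketch of the right-sizing check described above.
# The numbers are placeholders; read the real values from the Resource Usage tab.

allocated_ram_gb = 64.0   # RAM requested for the task
peak_ram_used_gb = 9.5    # max RAM usage reported in the Resource Usage tab

utilization_pct = peak_ram_used_gb / allocated_ram_gb * 100
print(f"Peak RAM utilization: {utilization_pct:.0f}% of allocation")

# If peak usage stays well below the allocation, a smaller (cheaper) compute
# tier is likely sufficient for similar input files.
if utilization_pct < 50:
    print("Consider a smaller compute tier for comparable tasks.")
```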

Preview results

From the analysis table, click Go to Results, which takes you to the Analysis Results columns.


Click any file to preview.

Organize results with data

As mentioned above, the default setting automatically attaches the analysis results to the recording. Therefore, you can visualize the results from the dataset table.

  1. From the Insert menu, click Insert column and then select Analysis Result.

  2. If you have multiple analyses associated with a recording, you will have the option to select which analysis table to pull data from. You can also specify which results are inserted, as shown in the example below.

    Click Add Column. You should now see your analysis results in the dataset table.

Tool versions

When new features are added or bugs are fixed for an analysis tool, a new version is released. You can see which tool version was used for a task by looking at the Tool Version column in the analysis table, as shown below.

Changes to a tool can be minor (e.g., changing the axis label of an output data preview) or major (e.g., changing the parameters that the user can modify or modifying the underlying logic of the analysis method).

IDEAS groups tool versions into tabs on the analysis table based on the following change types:

  • Minor Change

    • there is no change to the format of the analysis table (i.e., no new or removed parameters or outputs)
    • the changes are not meaningfully different enough to warrant creating a new version tab
  • Major Change

    • there is a change to the format of the analysis table (i.e., a column has been added or removed)
    • the changes are meaningfully different enough to warrant creating a new version tab

For example, the image below shows two tabs (one for the most recent version, 5.3.0, and one for versions 3.9.0 - 3.7.0). A task run with version 3.7.1 would represent a minor change compared to 3.9.0.

Note

Tool authors decide, at their discretion, how to group versions into tabs.

When running tasks, note the following:

  • Different minor versions can be run on the same analysis table tab.

    When running a task, select which minor version to execute from the Tool Version column, as shown below. By default, the latest version within the version tab is used.

  • Different major versions must be run on separate analysis table tabs.

    When a new major version is released, there will be an option to create a new tab on the analysis table for the new major version. For example, in the image below, previous analyses were run using version 1.1.0, and a new tab was then created for versions 2.2.2 - 1.3.0. A sketch of how versions map onto tabs follows below.
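The sketch below illustrates how tool versions map onto analysis table tabs, using the version ranges from the examples above. The tab boundaries themselves are set by tool authors; this is only an illustration of the inclusive-range behavior, not IDEAS code.

```python
# Illustration of version-tab grouping (not IDEAS code). Each tab covers an
# inclusive range of tool versions; the ranges below mirror the example above.

def parse(version: str) -> tuple:
    """Split a version string into a comparable tuple of integers."""
    return tuple(int(part) for part in version.split("."))

tabs = {
    "versions 2.2.2 - 1.3.0": (parse("1.3.0"), parse("2.2.2")),  # newest tab
    "version 1.1.0": (parse("1.1.0"), parse("1.1.0")),           # original tab
}

def tab_for(version: str):
    """Return the tab whose version range contains the given tool version."""
    v = parse(version)
    for name, (low, high) in tabs.items():
        if low <= v <= high:
            return name
    return None  # a new major version requires creating a new tab

print(tab_for("2.0.1"))  # -> "versions 2.2.2 - 1.3.0"
print(tab_for("3.0.0"))  # -> None
```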