
File Upload & Result Delivery

Execution Input / Output

Many real tasks do not end with a text reply. They need to receive files, process them, and return a result file.

This page focuses on the upload, processing, and download experience that customers actually touch.

ExecFabric already supports connecting input files, batch status, processing artifacts, and result download into one execution chain that customers can understand. The real point is not just whether a file can be uploaded. It is how the file is handled inside a controlled process and how the result comes back inside a delivery boundary.

EXECFABRIC // FILE FLOW / DOC 09

const fileFlow = ['file_upload', 'batch_track', 'process', 'result_download']
const largeFileFlow = ['init', 'chunk_transfer', 'merge_complete']
const resultPolicy = 'controlled_result_delivery'
FILES IN / TRACEABLE PROCESS / RESULTS OUT

01

Standard upload

Suited to common input files that can be submitted directly into a controlled upload flow before later processing or manual confirmation.

02

Large-file chunks

If the file is large, it can be uploaded in chunks so a transfer failure does not force the whole package to restart.

03

Batch status

A file does not disappear after upload. It enters a batch so later processing, tracking, and result matching stay clear.

04

Result download

If the task outputs a file, the platform can return that result file as a formal downloadable artifact.
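The chunked path in card 02 can be sketched in Python. This is a minimal illustration, not the platform API: `split_into_chunks` and the `send_chunk` callback are hypothetical stand-ins for the init / chunk_transfer / merge_complete steps. The point is only that a failed chunk is retried on its own instead of forcing the whole package to restart.

```python
import hashlib

def split_into_chunks(data: bytes, chunk_size: int):
    """Yield (index, chunk, sha256) so each part can be verified and retried alone."""
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        yield offset // chunk_size, chunk, hashlib.sha256(chunk).hexdigest()

def upload_in_chunks(data: bytes, send_chunk, chunk_size: int = 5 * 1024 * 1024,
                     max_retries: int = 3):
    """Send every chunk, retrying only the chunk that failed.

    `send_chunk(index, chunk, digest)` is a placeholder for the platform's
    chunk-transfer call; it must raise on a failed transfer.
    """
    for index, chunk, digest in split_into_chunks(data, chunk_size):
        for attempt in range(max_retries):
            try:
                send_chunk(index, chunk, digest)
                break  # this chunk is done; a later failure never re-sends it
            except OSError:
                if attempt == max_retries - 1:
                    raise  # give up only after this chunk fails max_retries times
```

A one-shot transfer that dies at 95% loses everything; here a transient error costs at most one chunk.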

End to End

How to read the full chain from upload to result

01. Submit the input file

The customer submits the input file through the upload entry opened for the current project. The exact entry depends on the delivery scope.

02. The file enters a controlled batch

The file first enters a controlled batch so later processing, tracking, and result matching remain clear.

03. Process it under project rules

The platform decides whether to process immediately, wait for an executor, or enter a longer business workflow according to the current project setup.

04. Review the batch and status

The user can directly see where this batch of files currently is in the process.

05. Generate the result artifact

If the task outputs Excel files, reports, summary files, or other formal results, the platform organizes them into a deliverable entry.

06. Download or continue using it

After the user gets the result file, they can download it, review it, or continue into the next business step.
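Read as code, the six steps above collapse to a short loop. Everything in this sketch is illustrative: `Batch`, the client object, and the status names are assumptions standing in for whatever entry points the project's delivery scope actually opens.

```python
from dataclasses import dataclass, field

@dataclass
class Batch:
    """Illustrative batch record; the real fields depend on the platform."""
    batch_no: str
    status: str = "UPLOADED"            # e.g. UPLOADED -> PROCESSING -> DONE
    result_files: list = field(default_factory=list)

def run_chain(client, input_path: str) -> list:
    """Walk the documented chain: submit, track the batch, collect results."""
    batch = client.submit(input_path)        # steps 01-02: file enters a controlled batch
    while client.status(batch) != "DONE":    # steps 03-04: processing under project rules
        client.wait()                        # poll, or block on a notification
    return client.results(batch)             # steps 05-06: downloadable result artifacts
```

The useful property is that the batch object threads through every step, so status review and result matching always refer to the same traceable unit.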

Chat Binding

After a file is uploaded in chat execution, how does the script know which file to read?

The script does not guess

The current design does not ask the script to scan a whole upload directory or guess from file names. It explicitly binds the batch number from this upload to the current chat execution.

The page entry is already consolidated

Users can now click Upload File in the quick area at the bottom of the chat execution view. It sits next to Upload Script, and the adjacent help icon explains that this batch will be bound to the current session.

The current batch is carried into execution

After upload, the current chat session stores this uploadBatchId. When the user confirms execution of a Skill, the backend passes the snapshot of that file batch into the execution.

The script receives the file snapshot for this run

At runtime the script gets the current batch number, input file list, and batch directory information. It processes the files bound to this confirmed execution, not a historical batch from somewhere else.

Stage | What happens now | Why it matters
Upload file | The frontend creates a controlled batch | The file enters a traceable batch first instead of being scattered across ad hoc uploads.
Trigger chat execution | The current session carries uploadBatchId | This makes it explicit which input batch the execution should consume.
Confirm execution | The backend fills in a fileService snapshot | The executor receives the batch number, directory, and input-file list.
Run the script | It reads only the currently bound batch | This avoids accidentally reading another batch, another customer, or older files.
What the script can read directly | Current meaning
EXECFABRIC_UPLOAD_BATCH_NO | The currently bound batch number, used to decide which input batch this execution should consume.
EXECFABRIC_UPLOAD_INPUT_FILES_JSON | The input-file array for the current batch. The script can directly read fields such as fileName, fileId, and size.
EXECFABRIC_UPLOAD_FILE_SERVICE_JSON | The full file-service snapshot, including context such as batchNo and inputFiles.
EXECFABRIC_SKILL_INPUT_PAYLOAD_JSON | The full input payload for this execution, which also keeps uploadBatchId and fileService.

Both Python and Shell executors currently inject these variables into the runtime environment. If the user uploads a new batch of files and confirms execution again, the new batch replaces the old one. Scripts should always read inputs from the current execution context instead of relying on unstable rules such as "the latest file."
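A minimal Python sketch of reading that snapshot, assuming the variables are injected exactly as the table above describes. `read_upload_context` is an illustrative helper, not a platform SDK function; the variable and field names (EXECFABRIC_UPLOAD_BATCH_NO, fileName, fileId, size) are the ones documented here.

```python
import json
import os

def read_upload_context(env=None):
    """Return (batch_no, input_files) for the batch bound to this execution.

    Illustrative helper: reads the documented EXECFABRIC_* variables from the
    runtime environment that the Python and Shell executors inject.
    """
    env = os.environ if env is None else env
    batch_no = env.get("EXECFABRIC_UPLOAD_BATCH_NO")
    if not batch_no:
        raise RuntimeError("no upload batch is bound to this execution")
    input_files = json.loads(env.get("EXECFABRIC_UPLOAD_INPUT_FILES_JSON", "[]"))
    return batch_no, input_files

if __name__ == "__main__":
    batch_no, files = read_upload_context()
    for f in files:  # documented fields: fileName, fileId, size
        print(f"{batch_no}: {f['fileName']} ({f['size']} bytes, id={f['fileId']})")
```

Reading from the injected context, rather than scanning directories or guessing "the latest file", is what keeps a re-run from picking up a stale batch.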

Fit

Which scenarios fit this path better

Scenario | Why it fits | Fit level | Notes
Upload raw files first, then generate reports | Both the input and output are files | Already a good fit | For example result sheets, summary tables, exports, or processed artifacts.
The file is large and a one-shot transfer may fail | A more stable upload method is needed | Already a good fit | Chunked upload is better than retrying the whole package.
You need to preserve the processing status of a batch of input files | It is easier to track the batch and the result | Already a good fit | The platform emphasizes the batch view instead of showing only "upload succeeded."
You only need text Q&A and no files are involved | The file path is not the main concern | No need to force it | Those scenarios should start with intelligent execution and the AI recommendation path.

Current Scope

Current fit boundary

  • Not every project opens an upload center by default. It depends on the current delivery scope.
  • If the project includes file processing, the platform can provide a file-upload and result-download chain.
  • If the project includes long-running tasks, the way results return will continue to be refined inside the project experience.
  • The file path is a business-data processing entry, not the same thing as a script hot-update entry.

Safety Rule

Safety boundaries that matter here

  • Customers can see only the file entry points and result artifacts opened for their own project.
  • Upload results and download entries are not shared across customers.
  • The platform prioritizes tenant boundaries, traceable results, and explainable delivery.
  • If the result involves stricter security requirements, visibility and download method are controlled by project boundary.

Next Read

Deliverables and onboarding materials

The file path is only one part of delivery. It is easier to evaluate a project completely when you read it together with customer flow, deliverables, and the onboarding checklist.

Crafting the unbreakable fabric of automation.