
Background XML Generator & S3 Bucket File Upload#213

Open
arpandas-collab wants to merge 6 commits into Shopify:main from arpandas-collab:ad-feeds-s3


@arpandas-collab

Overview:
This PR introduces a self-contained background process that runs continuously within the server environment. It executes a recurring job every 5 minutes that generates an XML payload of dummy products, writes it to the local volume, uploads it to an S3 bucket, and then cleans up its own local footprint.

Key Changes:
Background Interval Job: Hooked startInfiniteProcess() into app/entry.server.tsx so a recurring 5-minute interval starts on boot without blocking the main React server thread.
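A minimal sketch of that wiring (startInfiniteProcess is the name from this PR, but its body here is an assumption, not the actual implementation):

```typescript
// Hypothetical sketch of startInfiniteProcess; the PR's real body may differ.
const FIVE_MINUTES_MS = 5 * 60 * 1000;

export function startInfiniteProcess(
  job: () => Promise<void>,
  intervalMs: number = FIVE_MINUTES_MS,
): ReturnType<typeof setInterval> {
  // Wrap the job so a failed run is logged instead of killing the server.
  const safeRun = () =>
    job().catch((err) => console.error("[File Job] run failed:", err));
  safeRun(); // run once on boot
  // Timers fire on the event loop, so request handling is never blocked.
  return setInterval(safeRun, intervalMs);
}
```

The returned timer handle lets the caller stop the loop with clearInterval during shutdown.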
XML Generation: Generates random dummy products from the current timestamp, structures them into a well-formed XML string, and writes the result to the persistent mounted volume at /data/products.xml. No third-party GraphQL queries are currently required.
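A sketch of such a generator; the field names and XML structure here are illustrative assumptions, not necessarily what the PR emits:

```typescript
// Hypothetical dummy-product XML generator; fields are illustrative only.
interface DummyProduct {
  id: string;
  title: string;
  price: string;
}

function escapeXml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

export function generateProductsXml(count: number): string {
  const now = Date.now();
  // Derive dummy products from the current timestamp so each run is unique.
  const items = Array.from({ length: count }, (_, i): DummyProduct => ({
    id: `${now}-${i}`,
    title: `Dummy Product ${i + 1}`,
    price: (Math.random() * 100).toFixed(2),
  }));
  const body = items
    .map(
      (p) =>
        `  <product>\n` +
        `    <id>${escapeXml(p.id)}</id>\n` +
        `    <title>${escapeXml(p.title)}</title>\n` +
        `    <price>${escapeXml(p.price)}</price>\n` +
        `  </product>`,
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<products>\n${body}\n</products>\n`;
}
```

The resulting string would then be written out with something like fs.writeFileSync("/data/products.xml", xml) before the upload step.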

S3 Bucket Uploader: Added the @aws-sdk/client-s3 dependency and configured a PutObjectCommand to upload the generated file automatically to the connected Railway S3-compatible bucket.

Ephemeral Cleanup (fs.unlinkSync): Once the upload succeeds, the task removes the file from the local volume (/data/products.xml) so disk space is preserved.

Improved Logging: Added clear console.log statements tagged with [File Job] so executions are easy to trace in observability environments (generation stats, target bucket destination, and confirmation of cleanup).

Documentation: Updated context.txt to describe the architecture of the async background routine and the new AWS SDK dependency.

Infrastructure / Required Env Variables
This feature expects the following credentials to be injected automatically (usually populated out-of-the-box by the Railway S3 Plugin):

BUCKET_NAME (or AWS_BUCKET_NAME)
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_ENDPOINT_URL_S3 (Required if using MinIO/non-AWS S3)
(If bucket credentials are intentionally left unset during local development, the script fails gracefully: it logs "Skipped moving file" and continues without crashing.)
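That guard can be sketched as a small config resolver. resolveS3Config is a hypothetical helper name and the variable precedence shown is an assumption; the actual PR may perform this check inline:

```typescript
// Hypothetical credential guard; call with process.env on the server.
export interface S3Config {
  bucket: string;
  accessKeyId: string;
  secretAccessKey: string;
  endpoint?: string;
}

export function resolveS3Config(
  env: Record<string, string | undefined>,
): S3Config | null {
  const bucket = env.BUCKET_NAME ?? env.AWS_BUCKET_NAME;
  if (!bucket || !env.AWS_ACCESS_KEY_ID || !env.AWS_SECRET_ACCESS_KEY) {
    // Graceful local-development fallback: log and skip, never crash.
    console.log("[File Job] Skipped moving file (S3 credentials not configured)");
    return null;
  }
  return {
    bucket,
    accessKeyId: env.AWS_ACCESS_KEY_ID,
    secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
    // Needed only for MinIO / non-AWS S3-compatible endpoints.
    endpoint: env.AWS_ENDPOINT_URL_S3,
  };
}
```

A null return tells the job to skip the upload step for that run while leaving the interval alive.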

Validation:
npm run build passes with no TypeScript errors.
Validated the fallback behavior when S3 credentials are absent.
Verified the upload handles the in-memory XML string allocation efficiently.
