| ❗ Looking for Mainframe Data Utilities v1? ❗ |
|---|
- Security
- License
- About
- Status
- Requirements
- Limitations
- Download
- Examples
- Backlog
## Security

See CONTRIBUTING for more information.
## License

This project is licensed under the Apache-2.0 License.
## About

Mainframe Data Utilities is an AWS Sample written in Python.

The purpose of this project is to provide Python scripts as a starting point for those who need to read, on AWS or in any distributed environment, EBCDIC files transferred from mainframes and IBM i platforms.
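A minimal sketch of what "reading an EBCDIC file" involves: slicing a fixed-width record and decoding each field with an EBCDIC code page. The record, layout, and code page (037) below are made-up illustrations, not this project's API.

```python
# Decode a fixed-width EBCDIC record field by field.
# Record bytes and layout are hypothetical examples.
record = b"\xC1\xC2\xC3\xF0\xF1\xF2"      # "ABC012" in EBCDIC (code page 037)
layout = [("NAME", 0, 3), ("CODE", 3, 3)]  # (field name, offset, length)

fields = {name: record[off:off + ln].decode("cp037") for name, off, ln in layout}
print(fields)  # {'NAME': 'ABC', 'CODE': '012'}
```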
## Limitations

- File layouts defined inside COBOL programs are not supported.
- The file's logical record length is computed as the sum of all field sizes; in some cases this may be smaller than the record length in the physical file definition.
- The `REDEFINES` statement is supported only for group items; `REDEFINES` of elementary data items is not supported.
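The logical-record-length rule can be illustrated with a small sketch (field names and sizes below are hypothetical):

```python
# LRECL here is simply the sum of the field sizes in the layout.
layout = [("CUST-ID", 8), ("NAME", 30), ("BALANCE", 5)]  # BALANCE packed into 5 bytes

lrecl = sum(size for _, size in layout)
print(lrecl)  # 43 -- may be smaller than the physical record length
```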
## Download

Download the code to run a preloaded example.

From a Windows, Mac, or Linux shell (including AWS CloudShell), clone this repo and change directory:

```shell
git clone https://github.com/aws-samples/mainframe-data-utilities.git mdu
cd mdu
```
## Examples

The examples below show how to extract data in different use cases:
| Document | Description |
|---|---|
| Single Layout FB file | The simplest conversion. Local, 'fixed blocked' and 'single layout' input file. |
| Read JSON metadata from Amazon S3 | The JSON metadata file read from S3. |
| Single Layout FB file, multithreaded | Convert a file using multithreading, generating multiple output files. |
| Single Layout VB file | Convert a Variable Block input file. |
| Multiple Layout file | Convert a multiple layout input file. |
| Read the input file from S3 | Get the input file from S3 and generate a local converted file. |
| Write the output file on S3 | Read a local file and write a converted file on S3. |
| Write the output data on DynamoDB | Read a local file and write its data on DynamoDB. |
| Convert files using a Lambda function | Use a Lambda function to read an EBCDIC file from S3 and write the converted file back to S3. |
| Convert files using S3 Object Lambda | Use an S3 Object Lambda to convert an EBCDIC file while it is downloaded from S3. |
| Split files by content/key | Split an EBCDIC file according to a provided key. |
| Discard specific layout | Convert a multiple layout input file while discarding selected record types. |
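The multiple-layout and discard use cases in the table rest on one idea: a record-type field selects which layout applies, and unwanted types are skipped. A hedged sketch follows; the type codes, layouts, and ASCII input are hypothetical (real input would be EBCDIC), and `convert` is not this project's function.

```python
# Dispatch records by a record-type byte; drop discarded types.
LAYOUTS = {
    "1": [("NAME", 1, 3)],    # type "1": header-style record
    "2": [("AMOUNT", 1, 3)],  # type "2": detail-style record
}
DISCARD = {"9"}               # type "9" records are skipped

def convert(record):
    rtype = record[0]
    if rtype in DISCARD:
        return None
    return {name: record[off:off + ln] for name, off, ln in LAYOUTS[rtype]}

print(convert("1ABC"))  # {'NAME': 'ABC'}
print(convert("9XYZ"))  # None
```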
## Backlog

- Add error handling (try / except) where it is still missing.
- Test automation.
- Code organization / refactoring.
- OCCURS DEPENDING ON copybook parsing.
- Data item REDEFINES.
- Aurora schema parser (DDL)
- Support equivalent packing statements (BINARY, PACKED-DECIMAL, ...)
- Handle packing statements (COMP, COMP-3, etc.) declared before the PIC statement
- Aurora data load
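The COMP-3 backlog items refer to IBM packed-decimal encoding: two decimal digits per byte, with the low nibble of the last byte holding the sign. A hedged sketch of decoding such a field (the function is illustrative, not this project's code):

```python
def unpack_comp3(data, scale=0):
    """Decode an IBM packed-decimal (COMP-3) field.

    Each nibble holds one decimal digit; the final nibble is the
    sign (0xD = negative). `scale` shifts the decimal point left.
    """
    digits = []
    for b in data:
        digits.append(b >> 4)
        digits.append(b & 0x0F)
    sign = digits.pop()          # last nibble is the sign
    value = 0
    for d in digits:
        value = value * 10 + d
    if sign == 0x0D:
        value = -value
    return value / 10 ** scale if scale else value

print(unpack_comp3(b"\x12\x34\x5D", scale=2))  # -123.45
```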