E9.2 Groovy Write to file gets stuck

dtujo2022

I have an Orchestration that retrieves a large amount of data from a Data Request (F4102) and passes the raw data set to a custom Groovy script.


The Orchestration runs, and I can watch the text file it writes to grow until it hits roughly 2,050 lines; then the run-orchestration spinner just sits there with no further progress.
I have tried implementing the script several different ways, using buffering and appending line by line. When appending line by line, I can see the text file size increase KB by KB up until roughly 950 KB and then just stop. I can't figure out why it's hanging.
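One way to narrow down where it stalls is to flush after every line and log a running count, so the last logged number points at the record where things stop. This is a minimal sketch in plain Java (the same `java.nio` APIs are callable from a Groovy script step); the file path and input list here are placeholders, not the Orchestration's actual inputs.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class FlushProbe {
    // Writes each line and flushes immediately, printing progress every
    // 500 lines so a stall's exact position shows up in the log.
    static void writeWithProbe(Path out, List<String> lines) throws IOException {
        try (BufferedWriter w = Files.newBufferedWriter(out)) {
            int n = 0;
            for (String line : lines) {
                w.write(line);
                w.newLine();
                w.flush(); // force bytes out per line so the file reflects true progress
                if (++n % 500 == 0) {
                    System.out.println("wrote " + n + " lines");
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("probe", ".txt");
        writeWithProbe(tmp, List.of("a\tb", "c\td"));
        System.out.println(Files.readAllLines(tmp).size()); // prints 2
    }
}
```

If the count always stops at the same record, the data is suspect; if it stops at a round byte size regardless of data, the limit is more likely in the runtime or I/O layer.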

Attached are two versions of the Groovy script that I have tried.
 

Attachments

  • script1 (1).txt (7.7 KB)
  • script1 (2).txt (7.8 KB)
Here is the code for another Orchestration where I do the same thing with different data; it completes the full 1,200+ KB and almost 8,000 lines.

Groovy:
import com.oracle.e1.common.OrchestrationAttributes;
import java.text.SimpleDateFormat;
import groovy.json.*

HashMap<String, Object> main(OrchestrationAttributes orchAttr, HashMap inputMap)
{
  HashMap<String, Object> returnMap = new HashMap<String, Object>();
    // Get Environment and Path
    String env = inputMap.get("Environment");
    String filename = "InventoryFeeder.txt";
    String path;
    if(env =="JDV920"){
        path = inputMap.get("DVPath");
    } else if (env == "JPY920"){
        path = inputMap.get("PYPath");
    } else if (env == "JPD920"){
        path = inputMap.get("PDPath");
    } else {
        path = inputMap.get("DVPath");
    }
    // Build Complete Path
    path = path + filename;
    
    // Get Json String
    String InventoryArrayString = (String)inputMap.get("jsonStr");
    
    // Build Json Object from String
    def jsonSlurper = new JsonSlurper();
    def InventoryObject = jsonSlurper.parseText(InventoryArrayString);
    
    // Get Inventory Array
    def InventoryArray = InventoryObject.Inventory;
    
    // Delete Current File
    File deleteFile = new File(path);
    boolean delete = deleteFile.delete();
    
    // Create writer and iterate over inventory data, adding one line per record
    new File(path).withWriter{writer ->
        String headerLine = "LOTKEY\tBusinessUnit\tHPKEY\tSecondItemNumber\tLocation\tQuantityOnHand\tQtyHardCommitted\tQtyOnHand\tQtyAvailable\tPieceDescription\tLotStatusCode\tIngotLetter\t";
        headerLine = headerLine + "Heat\tUsableLength\tUsableLengthUOM\tIngotDescrCode\tMajorDiamHeight\tMinorDiamWidth\tDimensionUOM\tShapeCode\tOrigMajorHeight\tOrigMinorWidth\tOriginalUOM\t";
        headerLine = headerLine + "OrigShapeCode\tMillId\tMeasuredUsableLength\tOrigIngotYield\tIngotTheoreticalYield\tTopBottomIndicator";
        writer.writeLine headerLine;
        InventoryArray.each{ x ->
            String line = x.Lot + "\t" + x.CostCenter + "\t" + x.Y55UserDefString2 + "\t" + x.Identifier2ndItem + "\t" + x.Location + "\t" + x.QtyOnHandPrimaryUn + "\t" + x.QuantityOnWorkorder + "\t";
            line = line + x.QtyOnHandInSeconda + "\t" + x.UnitsQuantityAvailable + "\t" + x.DescriptionLot + "\t" + x.LotStatusCode + "\t" + x.MemoLotField1 + "\t" + x.MemoLotField2 + "\t" + x.Y55UsableLengh + "\t";
            line = line + x.Y55UsableLenghUOM + "\t" + x.Y55IngotDescCode + "\t" + x.Y55MajorDiOrHeight + "\t" + x.Y55MinorDiOrWidth + "\t" + x.Y55DimensionUOM + "\t" + x.Y55ShapeCode + "\t" + x.Y55OrigMajDiOrHeight + "\t";
            line = line + x.Y55OrigMinDiOrWidth + "\t" + x.Y55OrigDimUOM + "\t" + x.Y55OrigShapeCode + "\t" + x.Y55MillID + "\t" + x.Y55UserDefNum4 + "\t" + x.Y55UserDefNum3 + "\t" + x.Y55UserDefNum1 + "\t" + x.Y55UserDefCode4;
            writer.writeLine line;
        }
    }

  return returnMap;
}
 
I have had challenges with writing large datasets to a text file like this. The only thing I can suggest is instead of parsing the data request and writing the file out all at once, try iterating over the array and appending one line at a time.
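That suggestion could look something like the sketch below, again in plain Java rather than Groovy (the `java.nio` calls are identical from a Groovy script). Opening the file with `CREATE` and `APPEND` on each call means every record is on disk before the next one is processed, so a hang leaves a usable partial file. The column values are placeholders, not the actual F4102 fields.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class AppendLine {
    // Appends a single line to the file, creating it if it does not exist.
    // Each call opens and closes the file, so progress survives even if a
    // later call hangs or the process dies.
    static void appendLine(Path out, String line) throws IOException {
        Files.write(out,
                (line + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("append", ".txt");
        appendLine(tmp, "LOTKEY\tBusinessUnit");   // header row (placeholder columns)
        appendLine(tmp, "L123\tM30");              // one data row (placeholder values)
        System.out.println(Files.readAllLines(tmp).size()); // prints 2
    }
}
```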
 
For probably 90k records, I can't imagine how slow this will be....
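If per-line appends are too slow at ~90k records, a middle ground is to batch: accumulate lines in a `StringBuilder` and append once every N lines, so a 90k-row export costs roughly 90 file opens instead of 90,000. A hedged sketch under that assumption; the batch size of 1,000 is an arbitrary starting point, not a tuned value.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.List;

public class BatchedAppend {
    // Buffers lines in memory and appends them to the file in chunks,
    // trading a little memory for far fewer open/close cycles.
    static void writeBatched(Path out, List<String> lines, int batchSize) throws IOException {
        StringBuilder batch = new StringBuilder();
        int inBatch = 0;
        for (String line : lines) {
            batch.append(line).append(System.lineSeparator());
            if (++inBatch == batchSize) {
                Files.write(out, batch.toString().getBytes(StandardCharsets.UTF_8),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
                batch.setLength(0);
                inBatch = 0;
            }
        }
        if (inBatch > 0) { // flush the final partial batch
            Files.write(out, batch.toString().getBytes(StandardCharsets.UTF_8),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("batched", ".txt");
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) rows.add("row" + i + "\tvalue" + i);
        writeBatched(tmp, rows, 1000); // two full batches plus a 500-line remainder
        System.out.println(Files.readAllLines(tmp).size()); // prints 2500
    }
}
```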
 