Memory issue #206


Merged: 28 commits, Apr 28, 2025
55 changes: 30 additions & 25 deletions README.md
@@ -56,20 +56,21 @@ The basic method for using this library is that you create a definition for you
- [File Entry Limitations](#file-entry-limitations)
- [File Size Limitations](#file-size-limitations)
- [Memory Limitations](#memory-limitations)
- [Exporting Anomalies Report](#exporting-anomalies-report)
- [Changelog](#changelog)
- [1.11.3 (In-Progress)](#1113-in-progress)
- [1.11.2](#1112)
- [1.11.0](#1110)
- [1.0.10](#1010)
- [1.0.8.2](#1082)
- [1.0.8](#108)
- [1.0.7](#107)
- [1.0.6](#106)
- [1.0.5](#105)
- [1.0.4](#104)
- [1.0.3](#103)
- [1.0.1](#101)
<!-- TOC -->
_ [1.11.3 (In-Progress)](#1113--in-progress-)
_ [1.11.2](#1112)
_ [1.11.0](#1110)
_ [1.0.10](#1010)
_ [1.0.8.2](#1082)
_ [1.0.8](#108)
_ [1.0.7](#107)
_ [1.0.6](#106)
_ [1.0.5](#105)
_ [1.0.4](#104)
_ [1.0.3](#103)
_ [1.0.1](#101)
<!-- TOC -->

## Installation

@@ -549,23 +550,23 @@ As of 1.11.3 we have introduced a series of guard rails. These allow you to cont

The following table lists all available guard rail properties and their default values:

| Property | Description | Affects | Scale | Default Value |
| ---------------------------------------- | ------------------------------------------------------ | ----------------------------------------------- | ---------- | ------------- |
| PROP_LOGPARSER_FILEENTRY_LIMIT | Maximum number of entries to parse per file | File parsing | Count | -1 (disabled) |
| PROP_LOGPARSER_FILESIZE_LIMIT | Maximum file size in MB to parse | File parsing | Megabytes | -1 (disabled) |
| PROP_LOGPARSER_HEAP_LIMIT | Maximum heap size increase in MB before warning | File parsing, FilterBy, Search, enrich, groupBy | Megabytes | -1 (disabled) |
| PROP_LOGPARSER_MEMORY_LIMIT_PERCENTAGE | Maximum percentage of memory usage before warning | File parsing, FilterBy, Search, enrich, groupBy | Percentage | -1 (disabled) |
| PROP_LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT | Whether to throw exception when memory limits exceeded | Memory Checks | Boolean | false |
| Property | Description | Affects | Scale | Default Value |
| ----------------------------------- | ------------------------------------------------------ | ----------------------------------------------- | ---------- | ------------- |
| LOGPARSER_FILEENTRY_LIMIT | Maximum number of entries to parse per file | File parsing | Count | -1 (disabled) |
| LOGPARSER_FILESIZE_LIMIT | Maximum file size in MB to parse | File parsing | Megabytes | -1 (disabled) |
| LOGPARSER_HEAP_LIMIT | Maximum heap size increase in MB before warning | File parsing, FilterBy, Search, enrich, groupBy | Megabytes | -1 (disabled) |
| LOGPARSER_MEMORY_LIMIT_PERCENTAGE | Maximum percentage of memory usage before warning | File parsing, FilterBy, Search, enrich, groupBy | Percentage | -1 (disabled) |
| LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT | Whether to throw exception when memory limits exceeded | Memory Checks | Boolean | false |

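Since `ParseGuardRails` reads these properties into static fields when the class is loaded, they should be set before the parser is first used — either on the JVM command line (e.g. `-DLOGPARSER_FILEENTRY_LIMIT=10000`) or programmatically. A minimal sketch with illustrative values (the class name is ours, not the library's):

```java
public class GuardRailSetup {
    public static void main(String[] args) {
        // Property names come from the table above; the values are illustrative.
        System.setProperty("LOGPARSER_FILEENTRY_LIMIT", "10000");          // count
        System.setProperty("LOGPARSER_FILESIZE_LIMIT", "250");             // megabytes
        System.setProperty("LOGPARSER_MEMORY_LIMIT_PERCENTAGE", "80");     // percentage
        System.setProperty("LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT", "true"); // boolean

        // The guard rails parse them the same way, with -1 meaning "disabled":
        int entryLimit = Integer.parseInt(System.getProperty("LOGPARSER_FILEENTRY_LIMIT", "-1"));
        System.out.println("Entry limit: " + entryLimit);
    }
}
```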
### File Entry Limitations

For whatever reason, you may want to set a limit on the number of entries you can extract from a file. This can be done by setting the system property _PROP_LOGPARSER_FILEENTRY_LIMIT_.
For whatever reason, you may want to set a limit on the number of entries you can extract from a file. This can be done by setting the system property _LOGPARSER_FILEENTRY_LIMIT_.

When set, the log parser stops parsing after reaching the limit in a file, and moves to the next file. Whenever this happens we log a WARNING and add the skipped file to our internal list of issues.
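A sketch of that behavior (hypothetical names, not the library's code): parsing simply stops once the per-file count reaches the limit, and the remainder of the file is skipped with a warning.

```java
import java.util.ArrayList;
import java.util.List;

public class EntryLimitSketch {
    // Illustrative stand-in for LOGPARSER_FILEENTRY_LIMIT; -1 would disable the check.
    static final int ENTRY_LIMIT = 3;

    public static List<String> parseFile(List<String> lines) {
        List<String> entries = new ArrayList<>();
        for (String line : lines) {
            if (ENTRY_LIMIT >= 0 && entries.size() >= ENTRY_LIMIT) {
                System.err.println("WARNING: entry limit reached; skipping the rest of the file");
                break; // the parser moves on to the next file
            }
            entries.add(line);
        }
        return entries;
    }

    public static void main(String[] args) {
        // Five input lines, but only ENTRY_LIMIT entries survive.
        System.out.println(parseFile(List.of("a", "b", "c", "d", "e")).size()); // prints 3
    }
}
```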

### File Size Limitations

For whatever reason, you may want to set a limit on the size of the files you parse. This can be done by setting the system property _PROP_LOGPARSER_FILESIZE_LIMIT_.
For whatever reason, you may want to set a limit on the size of the files you parse. This can be done by setting the system property _LOGPARSER_FILESIZE_LIMIT_.

When set, we create a warning regarding the file size and store it among the file size issues.

@@ -575,13 +576,17 @@ Although we will not stop a process from executing due to memory issues, we prov

These limitations are set with the following System properties:

- _PROP_LOGPARSER_HEAP_LIMIT_ : A heap-size increase limit (in MB) above which we log these occurrences.
- _PROP_LOGPARSER_MEMORY_LIMIT_PERCENTAGE_ : A percentage of occupied memory relative to the maximum memory.
- _LOGPARSER_HEAP_LIMIT_ : A heap-size increase limit (in MB) above which we log these occurrences.
- _LOGPARSER_MEMORY_LIMIT_PERCENTAGE_ : A percentage of occupied memory relative to the maximum memory.

We also have the possibility of throwing an exception in case the memory rules are surpassed. This is activated by setting the System property _PROP_LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT_ to true.
We also have the possibility of throwing an exception in case the memory rules are surpassed. This is activated by setting the System property _LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT_ to true.

You can also call the memory guard rails in your own implementation by calling `ParseGuardRails.checkMemoryLimits()`. This will check both heap and memory percentage limits.
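Conceptually, the percentage check compares occupied memory against the JVM's maximum. A sketch of the idea using plain `Runtime` arithmetic (not the library's exact implementation; the constant stands in for the system property):

```java
public class MemoryCheckSketch {
    // Illustrative stand-in for LOGPARSER_MEMORY_LIMIT_PERCENTAGE; -1 would disable it.
    static final double MEMORY_LIMIT_PERCENTAGE = 80.0;

    public static double occupiedPercentage() {
        Runtime rt = Runtime.getRuntime();
        long occupied = rt.totalMemory() - rt.freeMemory();
        return 100.0 * occupied / rt.maxMemory();
    }

    public static void main(String[] args) {
        double pct = occupiedPercentage();
        if (MEMORY_LIMIT_PERCENTAGE >= 0 && pct > MEMORY_LIMIT_PERCENTAGE) {
            // The library logs a warning at this point, or throws if
            // LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT is set to true.
            System.err.printf("WARNING: memory usage at %.1f%%%n", pct);
        }
        System.out.printf("Occupied: %.1f%%%n", pct);
    }
}
```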

### Exporting Anomalies Report

We have the possibility of exporting the anomalies report. This is done by calling the method `LogData#exportAnomaliesReport(String fileName)`. If you do not give an argument, `LogData#exportAnomaliesReport()` will export the anomalies to a file called anomalies.json.
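Judging by `getAnomalyReport` in `ParseGuardRails`, the exported file is a JSON object mapping each anomaly category to the offending sources. A stdlib-only sketch of that shape (the library itself uses Jackson; class name, file name, and entries here are illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class AnomalyExportSketch {
    // Hand-rolled JSON rendering, good enough for this illustration.
    public static String toJson(Map<String, Set<String>> report) {
        return report.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\": ["
                        + e.getValue().stream().map(v -> "\"" + v + "\"")
                                .collect(Collectors.joining(", "))
                        + "]")
                .collect(Collectors.joining(", ", "{", "}"));
    }

    public static void main(String[] args) throws IOException {
        Map<String, Set<String>> report = new LinkedHashMap<>();
        report.put("fileSizeLimitations", Set.of("huge-server.log"));
        if (!report.isEmpty()) { // mirrors the "only export if there are anomalies" rule
            Files.writeString(Path.of("anomalies.json"), toJson(report));
        }
    }
}
```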

## Changelog

### 1.11.3 (In-Progress)
@@ -37,6 +37,7 @@
*/
public class LogData<T extends StdLogEntry> {

public static final String STD_LOG_ERROR_ON_EMPTY_LOG_DATA = "No Log data to export. Please load the log data before re-attempting";
protected static Logger log = LogManager.getLogger();

/**
@@ -418,7 +419,7 @@ public File exportLogDataToCSV() throws LogDataExportToFileException {
.fetchEscapedTitle()
+ "-export.csv");
} else {
log.warn("No Log data to export. Please load the log data before re-attempting");
log.warn(STD_LOG_ERROR_ON_EMPTY_LOG_DATA);
return null;
}

@@ -437,7 +438,7 @@ public File exportLogDataToCSV(String in_fileName) {
if (l_firstEntry != null) {
return exportLogDataToCSV(l_firstEntry.fetchHeaders(), in_fileName);
} else {
log.warn("No Log data to export. Please load the log data before re-attempting");
log.warn(STD_LOG_ERROR_ON_EMPTY_LOG_DATA);
return null;
}
}
@@ -484,7 +485,7 @@ public File exportLogDataToHTML(String in_reportTitle, String in_htmlFileName) {
T l_firstEntry = this.fetchFirst();

if (l_firstEntry == null) {
log.error("No Log data to export. Please load the log data before re-attempting");
log.error(STD_LOG_ERROR_ON_EMPTY_LOG_DATA);
return null;
}
return exportLogDataToHTML(l_firstEntry.fetchHeaders(), in_reportTitle,
@@ -548,7 +549,7 @@ public File exportLogDataToJSON() throws LogDataExportToFileException {
return exportLogDataToJSON(l_firstEntry.fetchHeaders(),
l_firstEntry.getParseDefinition().fetchEscapedTitle() + "-export.json");
} else {
log.warn("No Log data to export. Please load the log data before re-attempting");
log.warn(STD_LOG_ERROR_ON_EMPTY_LOG_DATA);
return null;
}
}
@@ -565,7 +566,7 @@ public File exportLogDataToJSON(String in_jsonFileName) throws LogDataExportToFi
if (l_firstEntry != null) {
return exportLogDataToJSON(l_firstEntry.fetchHeaders(), in_jsonFileName);
} else {
log.warn("No Log data to export. Please load the log data before re-attempting");
log.warn(STD_LOG_ERROR_ON_EMPTY_LOG_DATA);
return null;
}
}
@@ -9,11 +9,13 @@
package com.adobe.campaign.tests.logparser.utils;

import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

@@ -29,17 +31,17 @@ public class ParseGuardRails {
protected static Map<String, Long> heapLimitations = new HashMap<>();
protected static Map<String, Double> memoryLimitations = new HashMap<>();


public static final String ANOMALY_REPORT_PATH = "./logParserAnomalies.json";
public static long HEAP_SIZE_AT_START = MemoryUtils.getCurrentHeapSizeMB();
public static int FILE_ENTRY_LIMIT = Integer.parseInt(System.getProperty("PROP_LOGPARSER_FILEENTRY_LIMIT", "-1"));
public static long HEAP_LIMIT = Integer.parseInt(System.getProperty("PROP_LOGPARSER_HEAP_LIMIT", "-1"));
public static int FILE_ENTRY_LIMIT = Integer.parseInt(System.getProperty("LOGPARSER_FILEENTRY_LIMIT", "-1"));
public static long HEAP_LIMIT = Integer.parseInt(System.getProperty("LOGPARSER_HEAP_LIMIT", "-1"));
public static double MEMORY_LIMIT_PERCENTAGE = Double
.parseDouble(System.getProperty("PROP_LOGPARSER_MEMORY_LIMIT_PERCENTAGE", "-1"));
.parseDouble(System.getProperty("LOGPARSER_MEMORY_LIMIT_PERCENTAGE", "-1"));
protected static boolean EXCEPTION_ON_MEMORY_LIMIT = Boolean
.parseBoolean(System.getProperty("PROP_LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT", "false"));
.parseBoolean(System.getProperty("LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT", "false"));

protected static long FILE_SIZE_LIMIT = Long
.parseLong(System.getProperty("PROP_LOGPARSER_FILESIZE_LIMIT", "-1"));
.parseLong(System.getProperty("LOGPARSER_FILESIZE_LIMIT", "-1"));
protected static int MEASUREMENT_SCALE = 1024 * 1024;

public static void reset() {
@@ -172,12 +174,42 @@ private static boolean hasReachedFileSizeLimit(long length) {
public static Map<String, Set<String>> getAnomalyReport() {
Map<String, Set<String>> report = new HashMap<>();

report.put("heapLimitations", heapLimitations.keySet());
report.put("memoryLimitations", memoryLimitations.keySet());
report.put("fileSizeLimitations", fileSizeLimitations.keySet());
report.put("entryLimitations", entryLimitations.keySet());
Map.of(
"heapLimitations", heapLimitations,
"memoryLimitations", memoryLimitations,
"fileSizeLimitations", fileSizeLimitations,
"entryLimitations", entryLimitations).forEach((key, map) -> {
if (!map.isEmpty()) {
report.put(key, map.keySet());
}
});

return report;
}

/**
* Exports the anomaly report to a JSON file
* The file will be created if it doesn't exist, or replaced if it does
* Only exports if there are anomalies to report
*/
public static void exportAnomalyReport() {
exportAnomalyReport(ANOMALY_REPORT_PATH);
}

/**
* Exports the anomaly report to a JSON file at the specified path
*
* @param filePath The path where to save the anomaly report
*/
public static void exportAnomalyReport(String filePath) {
Map<String, Set<String>> report = getAnomalyReport();
if (!report.isEmpty()) {
try {
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(new File(filePath), report);
} catch (IOException e) {
log.error("Failed to export anomaly report to {}", filePath, e);
}
}
}
}
@@ -12,31 +12,43 @@
import static org.hamcrest.Matchers.is;
import static org.testng.Assert.assertThrows;

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.hamcrest.Matchers;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

import com.adobe.campaign.tests.logparser.exceptions.MemoryLimitExceededException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ParseGuardRailsTest {

private Path tempFile;
private static final String ANOMALY_REPORT_PATH = ParseGuardRails.ANOMALY_REPORT_PATH;

@BeforeMethod
public void setup() throws IOException {
ParseGuardRails.reset();

System.clearProperty("LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT");
tempFile = Files.createTempFile("test", ".log");
// Clean up any existing anomaly report
new File(ANOMALY_REPORT_PATH).delete();
}

@AfterMethod
public void cleanup() throws IOException {
ParseGuardRails.reset();
System.clearProperty("LOGPARSER_EXCEPTION_ON_MEMORY_LIMIT");
Files.deleteIfExists(tempFile);
// Clean up the anomaly report
new File(ANOMALY_REPORT_PATH).delete();
}

@Test
@@ -135,19 +147,18 @@ public void testCheckMemoryLimits_WhenHeapLimitReached() {
assertThat("Should reach limit when heap limit is reached",
ParseGuardRails.checkMemoryLimits(), is(true));

assertThat("Should have anomaly report", ParseGuardRails.getAnomalyReport().size(), is(4));
assertThat("Should have anomaly report", ParseGuardRails.getAnomalyReport().size(), is(1));
assertThat("Should have heap limitation", ParseGuardRails.getAnomalyReport().get("heapLimitations").size(),
is(1));

assertThat("Should have memory limitation", ParseGuardRails.getAnomalyReport().get("memoryLimitations").size(),
is(0));
assertThat("Should not have a memory limitation",
!ParseGuardRails.getAnomalyReport().containsKey("memoryLimitations"));

assertThat("Should not have a file size limitation",
ParseGuardRails.getAnomalyReport().get("fileSizeLimitations").size(),
is(0));
!ParseGuardRails.getAnomalyReport().containsKey("fileSizeLimitations"));

assertThat("Should have entry limitation", ParseGuardRails.getAnomalyReport().get("entryLimitations").size(),
is(0));
assertThat("Should not have an entry limitation",
!ParseGuardRails.getAnomalyReport().containsKey("entryLimitations"));
}

@Test
@@ -179,4 +190,76 @@ public void testCheckMemoryLimits_WhenMemoryLimitReachedWithException() {
ParseGuardRails.MEMORY_LIMIT_PERCENTAGE = 0.0; // Set limit to 0% to force reaching it
assertThrows(MemoryLimitExceededException.class, () -> ParseGuardRails.checkMemoryLimits());
}

@Test
public void testExportAnomalyReport_WhenNoAnomalies() {
ParseGuardRails.exportAnomalyReport();
assertThat("Should not create file when no anomalies",
(new File(ANOMALY_REPORT_PATH)).exists(), is(false));
}

@Test
public void testExportAnomalyReport_WhenHasAnomalies() throws IOException {
// Create some anomalies
ParseGuardRails.HEAP_LIMIT = 1;
ParseGuardRails.HEAP_SIZE_AT_START = -20;
ParseGuardRails.checkMemoryLimits();

ParseGuardRails.exportAnomalyReport();

// Verify file exists
File reportFile = new File(ANOMALY_REPORT_PATH);
assertThat("Should create file when anomalies exist",
reportFile.exists(), is(true));

// Verify content
ObjectMapper mapper = new ObjectMapper();
Map<String, List<String>> report = mapper.readValue(reportFile, Map.class);
assertThat("Should have heap limitations",
report.containsKey("heapLimitations"));
assertThat("Should have at least one heap limitation",
report.get("heapLimitations").size(), Matchers.greaterThan(0));
}

@Test
public void testExportAnomalyReport_WhenFileExists() throws IOException {
// Create initial file
ObjectMapper mapper = new ObjectMapper();
Map<String, Set<String>> initialData = Map.of("test", Set.of("data"));
mapper.writeValue(new File(ANOMALY_REPORT_PATH), initialData);

// Create some anomalies
ParseGuardRails.HEAP_LIMIT = 1;
ParseGuardRails.HEAP_SIZE_AT_START = -20;
ParseGuardRails.checkMemoryLimits();

ParseGuardRails.exportAnomalyReport();

// Verify file was replaced
Map<String, Set<String>> report = mapper.readValue(new File(ANOMALY_REPORT_PATH), Map.class);
assertThat("Should have replaced old content",
report.containsKey("heapLimitations"), is(true));
assertThat("Should not have old content",
report.containsKey("test"), is(false));
}

@Test
public void testExportAnomalyReport_WhenIOExceptionOccurs() throws IOException {
// Create some anomalies
ParseGuardRails.HEAP_LIMIT = 1;
ParseGuardRails.HEAP_SIZE_AT_START = -20;
ParseGuardRails.checkMemoryLimits();

// Create a file that will cause an IOException when trying to write
File reportFile = new File(ANOMALY_REPORT_PATH);
reportFile.createNewFile();
reportFile.setReadOnly();

// This should log an error but not throw an exception
ParseGuardRails.exportAnomalyReport();

// Clean up
reportFile.setWritable(true);
reportFile.delete();
}
}