If Unomaly receives the data successfully but the data does not look correct or as you expect, the issue may be with how the data is tokenized.
Unomaly supports log messages of up to 8192 bytes, including protocol headers. If a larger log message arrives, Unomaly analyzes only the first 8192 bytes of the message.
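As a minimal sketch of this limit (the constant and function names are illustrative, not Unomaly internals), only the first 8192 bytes of an oversized message reach the analysis stage:

```python
# Hypothetical illustration of the ingest size limit; names are assumptions.
MAX_MESSAGE_BYTES = 8192  # limit includes protocol headers

def analyzed_portion(raw: bytes) -> bytes:
    """Return the part of a log message that would actually be analyzed."""
    return raw[:MAX_MESSAGE_BYTES]

oversized = b"x" * 10000
print(len(analyzed_portion(oversized)))  # everything past 8192 bytes is dropped
```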
Increase the number of tokens per event
Unomaly limits the number of tokens used to define structures for very long events, with the setting
max_events_objects in Settings > Advanced. If too many events with very long log lines are being recognized as the same event type, you can raise this limit so that Unomaly checks more tokens per event.
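The effect of the token limit can be sketched as follows. This is an illustration of the idea, not Unomaly's actual tokenizer; the function name and default value are assumptions:

```python
# Illustrative sketch: an event's structure is defined by at most `limit`
# tokens, as controlled by the max_events_objects setting (value assumed).
MAX_EVENT_OBJECTS = 64  # hypothetical default

def structure_tokens(line: str, limit: int = MAX_EVENT_OBJECTS) -> list[str]:
    """Tokenize a log line, keeping only the tokens used to define its structure."""
    return line.split()[:limit]

# Two long lines that differ only after the limit look structurally identical
# at a low limit, so they are treated as the same event type.
a = "GET /api " + " ".join(f"k{i}=v" for i in range(100)) + " extra=1"
b = "GET /api " + " ".join(f"k{i}=v" for i in range(100)) + " other=2"
print(structure_tokens(a, 50) == structure_tokens(b, 50))    # same type
print(structure_tokens(a, 200) == structure_tokens(b, 200))  # distinguished
```

Raising the limit lets the later, differing tokens participate in the structure definition, so the two lines are no longer merged into one event type.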
Improve detection of structural anomalies
A common source of noise in Unomaly is structural anomalies created by large JSON objects in the log data. Improved processing of these structures, together with a depth limit for identifying nested structures, helps the structural tokenizer mitigate this noise.
By default, Unomaly collapses structures from the third level into a simple token. This depth limit, which is the number of levels Unomaly will descend into nested structures when looking for parameters, is tunable per system. You can change the depth limit in the individual system's Settings > Advanced tab.
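The collapsing behavior can be sketched like this, assuming a tokenizer that recurses into nested objects up to the depth limit and replaces anything deeper with a single placeholder token (the function, token name, and exact semantics are assumptions for illustration):

```python
import json

# Illustrative sketch of depth-limited structural tokenization; not Unomaly
# source code. Per the article, structures collapse from the third level.
DEFAULT_DEPTH_LIMIT = 3

def tokenize(value, depth_limit=DEFAULT_DEPTH_LIMIT, depth=1):
    """Recurse into nested structures up to depth_limit; collapse the rest."""
    if isinstance(value, dict):
        if depth >= depth_limit:
            return "<struct>"  # deep structure becomes one simple token
        return {k: tokenize(v, depth_limit, depth + 1) for k, v in value.items()}
    return value

event = json.loads('{"a": {"b": {"c": {"d": 1}}}}')
print(tokenize(event))  # the third level and below collapse into "<struct>"
```

With a higher per-system depth limit, more of the nested structure is tokenized individually instead of being collapsed, at the cost of more tokens per event.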