Dataset columns (name: type, observed range):

- repo_name: string, 28 distinct values
- pr_number: int64, 8 to 3.71k
- pr_title: string, length 3 to 107
- pr_description: string, length 0 to 60.1k
- author: string, length 4 to 19
- date_created: unknown
- date_merged: unknown
- previous_commit: string, length 40 to 40
- pr_commit: string, length 40 to 40
- query: string, length 5 to 60.1k
- filepath: string, length 7 to 167
- before_content: string, length 0 to 103M
- after_content: string, length 0 to 103M
- label: int64, -1 to 1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./docs/layouts.md
# Layouts Layouts are functions used by appenders to format log events for output. They take a log event as an argument and return a string. Log4js comes with several appenders built-in, and provides ways to create your own if these are not suitable. For most use cases you will not need to configure layouts - there are some appenders which do not need layouts defined (for example, [logFaces-UDP](https://github.com/log4js-node/logFaces-UDP)); all the appenders that use layouts will have a sensible default defined. ## Configuration Most appender configuration will take a field called `layout`, which is an object - typically with a single field `type` which is the name of a layout defined below. Some layouts require extra configuration options, which should be included in the same object. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "basic" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); ``` This configuration replaces the [stdout](stdout.md) appender's default `coloured` layout with `basic` layout. # Built-in Layouts ## Basic - `type` - `basic` Basic layout will output the timestamp, level, category, followed by the formatted log event data. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "basic" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); logger.error("Cheese is too ripe!"); ``` This will output: ``` [2017-03-30 07:57:00.113] [ERROR] cheese - Cheese is too ripe! ``` ## Coloured - `type` - `coloured` (or `colored`) This layout is the same as `basic`, except that the timestamp, level and category will be coloured according to the log event's level (if your terminal/file supports it - if you see some weird characters in your output and no colour then you should probably switch to `basic`). The colours used are: - `TRACE` - 'blue' - `DEBUG` - 'cyan' - `INFO` - 'green' - `WARN` - 'yellow' - `ERROR` - 'red' - `FATAL` - 'magenta' ## Message Pass-Through - `type` - `messagePassThrough` This layout just formats the log event data, and does not output a timestamp, level or category. It is typically used in appenders that serialise the events using a specific format (e.g. [gelf](https://github.com/log4js-node/gelf)). ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "messagePassThrough" } }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); const cheeseName = "gouda"; logger.error("Cheese is too ripe! Cheese was: ", cheeseName); ``` This will output: ``` Cheese is too ripe! Cheese was: gouda ``` ## Dummy - `type` - `dummy` This layout only outputs the first value in the log event's data. It was added for the [logstashUDP](https://github.com/log4js-node/logstashUDP) appender, and I'm not sure there's much use for it outside that. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "dummy" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); const cheeseName = "gouda"; logger.error("Cheese is too ripe! Cheese was: ", cheeseName); ``` This will output: ``` Cheese is too ripe! 
Cheese was: ``` ## Pattern - `type` - `pattern` - `pattern` - `string` - specifier for the output format, using placeholders as described below - `tokens` - `object` (optional) - user-defined tokens to be used in the pattern ## Pattern format The pattern string can contain any characters, but sequences beginning with `%` will be replaced with values taken from the log event, and other environmental values. Format for specifiers is `%[padding].[truncation][field]{[format]}` - padding and truncation are optional, and format only applies to a few tokens (notably, date). Both padding and truncation values can be negative. - Positive truncation - truncate the string starting from the beginning - Negative truncation - truncate the string starting from the end of the string - Positive padding - left pad the string to make it this length, if the string is longer than the padding value then nothing happens - Negative padding - right pad the string to make it this length, if the string is longer than the padding value then nothing happens To make fixed-width columns in your log output, set padding and truncation to the same size (they don't have to have the same sign though, you could have right truncated, left padded columns that are always 10 characters wide with a pattern like "%10.-10m"). e.g. %5.10p - left pad the log level by up to 5 characters, keep the whole string to a max length of 10. So, for a log level of INFO the output would be " INFO", for DEBUG it would be "DEBUG" and for a (custom) log level of CATASTROPHIC it would be "CATASTROPH". Fields can be any of: - `%r` time in toLocaleTimeString format - `%p` log level - `%c` log category - `%h` hostname - `%m` log data - `%d` date, formatted - default is `ISO8601`, format options are: `ISO8601`, `ISO8601_WITH_TZ_OFFSET`, `ABSOLUTETIME`, `DATETIME`, or any string compatible with the [date-format](https://www.npmjs.com/package/date-format) library. e.g. `%d{DATETIME}`, `%d{yyyy/MM/dd-hh.mm.ss}` - `%%` % - for when you want a literal `%` in your output - `%n` newline - `%z` process id (from `process.pid`) - `%f` full path of filename (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%f{depth}` path depth lets you choose to have only the filename (`%f{1}`) or a chosen number of directories - `%l` line number (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%o` column position (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%s` call stack (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%C` class name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%M` method or function name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%A` method or function alias (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%F` fully qualified caller name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%x{<tokenname>}` add dynamic tokens to your log. Tokens are specified in the tokens parameter. - `%X{<tokenname>}` add values from the Logger context. 
Tokens are keys into the context values. - `%[` start a coloured block (colour will be taken from the log level, similar to `colouredLayout`) - `%]` end a coloured block ## Tokens User-defined tokens can be either a string or a function. Functions will be passed the log event, and should return a string. For example, you could define a custom token that outputs the log event's context value for 'user' like so: ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "pattern", pattern: "%d %p %c %x{user} %m%n", tokens: { user: function (logEvent) { return AuthLibrary.currentUser(); }, }, }, }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger(); logger.info("doing something."); ``` This would output: ``` 2017-06-01 08:32:56.283 INFO default charlie doing something. ``` You can also use the Logger context to store tokens (sometimes called Nested Diagnostic Context, or Mapped Diagnostic Context) and use them in your layouts. ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "pattern", pattern: "%d %p %c %X{user} %m%n", }, }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger(); logger.addContext("user", "charlie"); logger.info("doing something."); ``` This would output: ``` 2017-06-01 08:32:56.283 INFO default charlie doing something. ``` Note that you can also add functions to the Logger Context, and they will be passed the logEvent as well. # Adding your own layouts You can add your own layouts by calling `log4js.addLayout(type, fn)` before calling `log4js.configure`. `type` is the label you want to use to refer to your layout in appender configuration. `fn` is a function that takes a single object argument, which will contain the configuration for the layout instance, and returns a layout function. A layout function takes a log event argument and returns a string (usually, although you could return anything as long as the appender knows what to do with it). ## Custom Layout Example This example can also be found in examples/custom-layout.js. ```javascript const log4js = require("log4js"); log4js.addLayout("json", function (config) { return function (logEvent) { return JSON.stringify(logEvent) + config.separator; }; }); log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "json", separator: "," } }, }, categories: { default: { appenders: ["out"], level: "info" }, }, }); const logger = log4js.getLogger("json-test"); logger.info("this is just a test"); logger.error("of a custom appender"); logger.warn("that outputs json"); log4js.shutdown(() => {}); ``` This example outputs the following: ```javascript {"startTime":"2017-06-05T22:23:08.479Z","categoryName":"json-test","data":["this is just a test"],"level":{"level":20000,"levelStr":"INFO"},"context":{}}, {"startTime":"2017-06-05T22:23:08.483Z","categoryName":"json-test","data":["of a custom appender"],"level":{"level":40000,"levelStr":"ERROR"},"context":{}}, {"startTime":"2017-06-05T22:23:08.483Z","categoryName":"json-test","data":["that outputs json"],"level":{"level":30000,"levelStr":"WARN"},"context":{}}, ```
# Layouts Layouts are functions used by appenders to format log events for output. They take a log event as an argument and return a string. Log4js comes with several appenders built-in, and provides ways to create your own if these are not suitable. For most use cases you will not need to configure layouts - there are some appenders which do not need layouts defined (for example, [logFaces-UDP](https://github.com/log4js-node/logFaces-UDP)); all the appenders that use layouts will have a sensible default defined. ## Configuration Most appender configuration will take a field called `layout`, which is an object - typically with a single field `type` which is the name of a layout defined below. Some layouts require extra configuration options, which should be included in the same object. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "basic" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); ``` This configuration replaces the [stdout](stdout.md) appender's default `coloured` layout with `basic` layout. # Built-in Layouts ## Basic - `type` - `basic` Basic layout will output the timestamp, level, category, followed by the formatted log event data. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "basic" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); logger.error("Cheese is too ripe!"); ``` This will output: ``` [2017-03-30 07:57:00.113] [ERROR] cheese - Cheese is too ripe! ``` ## Coloured - `type` - `coloured` (or `colored`) This layout is the same as `basic`, except that the timestamp, level and category will be coloured according to the log event's level (if your terminal/file supports it - if you see some weird characters in your output and no colour then you should probably switch to `basic`). The colours used are: - `TRACE` - 'blue' - `DEBUG` - 'cyan' - `INFO` - 'green' - `WARN` - 'yellow' - `ERROR` - 'red' - `FATAL` - 'magenta' ## Message Pass-Through - `type` - `messagePassThrough` This layout just formats the log event data, and does not output a timestamp, level or category. It is typically used in appenders that serialise the events using a specific format (e.g. [gelf](https://github.com/log4js-node/gelf)). ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "messagePassThrough" } }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); const cheeseName = "gouda"; logger.error("Cheese is too ripe! Cheese was: ", cheeseName); ``` This will output: ``` Cheese is too ripe! Cheese was: gouda ``` ## Dummy - `type` - `dummy` This layout only outputs the first value in the log event's data. It was added for the [logstashUDP](https://github.com/log4js-node/logstashUDP) appender, and I'm not sure there's much use for it outside that. ## Example ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "dummy" } } }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger("cheese"); const cheeseName = "gouda"; logger.error("Cheese is too ripe! Cheese was: ", cheeseName); ``` This will output: ``` Cheese is too ripe! 
Cheese was: ``` ## Pattern - `type` - `pattern` - `pattern` - `string` - specifier for the output format, using placeholders as described below - `tokens` - `object` (optional) - user-defined tokens to be used in the pattern ## Pattern format The pattern string can contain any characters, but sequences beginning with `%` will be replaced with values taken from the log event, and other environmental values. Format for specifiers is `%[padding].[truncation][field]{[format]}` - padding and truncation are optional, and format only applies to a few tokens (notably, date). Both padding and truncation values can be negative. - Positive truncation - truncate the string starting from the beginning - Negative truncation - truncate the string starting from the end of the string - Positive padding - left pad the string to make it this length, if the string is longer than the padding value then nothing happens - Negative padding - right pad the string to make it this length, if the string is longer than the padding value then nothing happens To make fixed-width columns in your log output, set padding and truncation to the same size (they don't have to have the same sign though, you could have right truncated, left padded columns that are always 10 characters wide with a pattern like "%10.-10m"). e.g. %5.10p - left pad the log level by up to 5 characters, keep the whole string to a max length of 10. So, for a log level of INFO the output would be " INFO", for DEBUG it would be "DEBUG" and for a (custom) log level of CATASTROPHIC it would be "CATASTROPH". Fields can be any of: - `%r` time in toLocaleTimeString format - `%p` log level - `%c` log category - `%h` hostname - `%m` log data - `%d` date, formatted - default is `ISO8601`, format options are: `ISO8601`, `ISO8601_WITH_TZ_OFFSET`, `ABSOLUTETIME`, `DATETIME`, or any string compatible with the [date-format](https://www.npmjs.com/package/date-format) library. e.g. `%d{DATETIME}`, `%d{yyyy/MM/dd-hh.mm.ss}` - `%%` % - for when you want a literal `%` in your output - `%n` newline - `%z` process id (from `process.pid`) - `%f` full path of filename (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%f{depth}` path depth lets you choose to have only the filename (`%f{1}`) or a chosen number of directories - `%l` line number (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%o` column position (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%s` call stack (requires `enableCallStack: true` on the category, see [configuration object](api.md)) - `%C` class name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%M` method or function name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%A` method or function alias (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%F` fully qualified caller name (requires `enableCallStack: true` on the category, see [configuration object](api.md) and [#1316](https://github.com/log4js-node/log4js-node/pull/1316)) - `%x{<tokenname>}` add dynamic tokens to your log. Tokens are specified in the tokens parameter. - `%X{<tokenname>}` add values from the Logger context. 
Tokens are keys into the context values. - `%[` start a coloured block (colour will be taken from the log level, similar to `colouredLayout`) - `%]` end a coloured block ## Tokens User-defined tokens can be either a string or a function. Functions will be passed the log event, and should return a string. For example, you could define a custom token that outputs the log event's context value for 'user' like so: ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "pattern", pattern: "%d %p %c %x{user} %m%n", tokens: { user: function (logEvent) { return AuthLibrary.currentUser(); }, }, }, }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger(); logger.info("doing something."); ``` This would output: ``` 2017-06-01 08:32:56.283 INFO default charlie doing something. ``` You can also use the Logger context to store tokens (sometimes called Nested Diagnostic Context, or Mapped Diagnostic Context) and use them in your layouts. ```javascript log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "pattern", pattern: "%d %p %c %X{user} %m%n", }, }, }, categories: { default: { appenders: ["out"], level: "info" } }, }); const logger = log4js.getLogger(); logger.addContext("user", "charlie"); logger.info("doing something."); ``` This would output: ``` 2017-06-01 08:32:56.283 INFO default charlie doing something. ``` Note that you can also add functions to the Logger Context, and they will be passed the logEvent as well. # Adding your own layouts You can add your own layouts by calling `log4js.addLayout(type, fn)` before calling `log4js.configure`. `type` is the label you want to use to refer to your layout in appender configuration. `fn` is a function that takes a single object argument, which will contain the configuration for the layout instance, and returns a layout function. A layout function takes a log event argument and returns a string (usually, although you could return anything as long as the appender knows what to do with it). ## Custom Layout Example This example can also be found in examples/custom-layout.js. ```javascript const log4js = require("log4js"); log4js.addLayout("json", function (config) { return function (logEvent) { return JSON.stringify(logEvent) + config.separator; }; }); log4js.configure({ appenders: { out: { type: "stdout", layout: { type: "json", separator: "," } }, }, categories: { default: { appenders: ["out"], level: "info" }, }, }); const logger = log4js.getLogger("json-test"); logger.info("this is just a test"); logger.error("of a custom appender"); logger.warn("that outputs json"); log4js.shutdown(() => {}); ``` This example outputs the following: ```javascript {"startTime":"2017-06-05T22:23:08.479Z","categoryName":"json-test","data":["this is just a test"],"level":{"level":20000,"levelStr":"INFO"},"context":{}}, {"startTime":"2017-06-05T22:23:08.483Z","categoryName":"json-test","data":["of a custom appender"],"level":{"level":40000,"levelStr":"ERROR"},"context":{}}, {"startTime":"2017-06-05T22:23:08.483Z","categoryName":"json-test","data":["that outputs json"],"level":{"level":30000,"levelStr":"WARN"},"context":{}}, ```
-1
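The PR referenced in these rows (log4js-node/log4js-node#1334) tightens `log4js.shutdown()`: if a callback is supplied, it must be a function, otherwise an error is thrown immediately. A minimal sketch of the behaviour described by the PR title; the exact error type and message are assumptions, not taken from this excerpt:

```javascript
const log4js = require("log4js");

log4js.configure({
  appenders: { out: { type: "stdout" } },
  categories: { default: { appenders: ["out"], level: "info" } },
});

// Supported: no callback at all, or a function callback that is invoked
// once every appender has finished shutting down.
log4js.shutdown((err) => {
  if (err) console.error("shutdown reported an error:", err);
});

// Per the PR description, a non-function callback is rejected up front
// rather than failing later inside an appender (error type assumed here).
try {
  log4js.shutdown("not a function");
} catch (e) {
  console.error("shutdown threw immediately:", e.message);
}
```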
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/appenders/multiprocess.js
const debug = require('debug')('log4js:multiprocess'); const net = require('net'); const LoggingEvent = require('../LoggingEvent'); const END_MSG = '__LOG4JS__'; /** * Creates a server, listening on config.loggerPort, config.loggerHost. * Output goes to config.actualAppender (config.appender is used to * set up that appender). */ function logServer(config, actualAppender, levels) { /** * Takes a utf-8 string, returns an object with * the correct log properties. */ function deserializeLoggingEvent(clientSocket, msg) { debug('(master) deserialising log event'); const loggingEvent = LoggingEvent.deserialise(msg); loggingEvent.remoteAddress = clientSocket.remoteAddress; loggingEvent.remotePort = clientSocket.remotePort; return loggingEvent; } const server = net.createServer((clientSocket) => { debug('(master) connection received'); clientSocket.setEncoding('utf8'); let logMessage = ''; function logTheMessage(msg) { debug('(master) deserialising log event and sending to actual appender'); actualAppender(deserializeLoggingEvent(clientSocket, msg)); } function chunkReceived(chunk) { debug('(master) chunk of data received'); let event; logMessage += chunk || ''; if (logMessage.indexOf(END_MSG) > -1) { event = logMessage.slice(0, logMessage.indexOf(END_MSG)); logTheMessage(event); logMessage = logMessage.slice(event.length + END_MSG.length) || ''; // check for more, maybe it was a big chunk chunkReceived(); } } function handleError(error) { const loggingEvent = { startTime: new Date(), categoryName: 'log4js', level: levels.ERROR, data: ['A worker log process hung up unexpectedly', error], remoteAddress: clientSocket.remoteAddress, remotePort: clientSocket.remotePort, }; actualAppender(loggingEvent); } clientSocket.on('data', chunkReceived); clientSocket.on('end', chunkReceived); clientSocket.on('error', handleError); }); server.listen( config.loggerPort || 5000, config.loggerHost || 'localhost', (e) => { debug('(master) master server listening, error was ', e); // allow the process to exit, if this is the only socket active server.unref(); } ); function app(event) { debug('(master) log event sent directly to actual appender (local event)'); return actualAppender(event); } app.shutdown = function (cb) { debug('(master) master shutdown called, closing server'); server.close(cb); }; return app; } function workerAppender(config) { let canWrite = false; const buffer = []; let socket; let shutdownAttempts = 3; function write(loggingEvent) { debug('(worker) Writing log event to socket'); socket.write(loggingEvent.serialise(), 'utf8'); socket.write(END_MSG, 'utf8'); } function emptyBuffer() { let evt; debug('(worker) emptying worker buffer'); while ((evt = buffer.shift())) { write(evt); } } function createSocket() { debug( `(worker) worker appender creating socket to ${ config.loggerHost || 'localhost' }:${config.loggerPort || 5000}` ); socket = net.createConnection( config.loggerPort || 5000, config.loggerHost || 'localhost' ); socket.on('connect', () => { debug('(worker) worker socket connected'); emptyBuffer(); canWrite = true; }); socket.on('timeout', socket.end.bind(socket)); socket.on('error', (e) => { debug('connection error', e); canWrite = false; emptyBuffer(); }); socket.on('close', createSocket); } createSocket(); function log(loggingEvent) { if (canWrite) { write(loggingEvent); } else { debug( '(worker) worker buffering log event because it cannot write at the moment' ); buffer.push(loggingEvent); } } log.shutdown = function (cb) { debug('(worker) worker shutdown called'); if (buffer.length && 
shutdownAttempts) { debug('(worker) worker buffer has items, waiting 100ms to empty'); shutdownAttempts -= 1; setTimeout(() => { log.shutdown(cb); }, 100); } else { socket.removeAllListeners('close'); socket.end(cb); } }; return log; } function createAppender(config, appender, levels) { if (config.mode === 'master') { debug('Creating master appender'); return logServer(config, appender, levels); } debug('Creating worker appender'); return workerAppender(config); } function configure(config, layouts, findAppender, levels) { let appender; debug(`configure with mode = ${config.mode}`); if (config.mode === 'master') { if (!config.appender) { debug(`no appender found in config ${config}`); throw new Error('multiprocess master must have an "appender" defined'); } debug(`actual appender is ${config.appender}`); appender = findAppender(config.appender); if (!appender) { debug(`actual appender "${config.appender}" not found`); throw new Error( `multiprocess master appender "${config.appender}" not defined` ); } } return createAppender(config, appender, levels); } module.exports.configure = configure;
const debug = require('debug')('log4js:multiprocess'); const net = require('net'); const LoggingEvent = require('../LoggingEvent'); const END_MSG = '__LOG4JS__'; /** * Creates a server, listening on config.loggerPort, config.loggerHost. * Output goes to config.actualAppender (config.appender is used to * set up that appender). */ function logServer(config, actualAppender, levels) { /** * Takes a utf-8 string, returns an object with * the correct log properties. */ function deserializeLoggingEvent(clientSocket, msg) { debug('(master) deserialising log event'); const loggingEvent = LoggingEvent.deserialise(msg); loggingEvent.remoteAddress = clientSocket.remoteAddress; loggingEvent.remotePort = clientSocket.remotePort; return loggingEvent; } const server = net.createServer((clientSocket) => { debug('(master) connection received'); clientSocket.setEncoding('utf8'); let logMessage = ''; function logTheMessage(msg) { debug('(master) deserialising log event and sending to actual appender'); actualAppender(deserializeLoggingEvent(clientSocket, msg)); } function chunkReceived(chunk) { debug('(master) chunk of data received'); let event; logMessage += chunk || ''; if (logMessage.indexOf(END_MSG) > -1) { event = logMessage.slice(0, logMessage.indexOf(END_MSG)); logTheMessage(event); logMessage = logMessage.slice(event.length + END_MSG.length) || ''; // check for more, maybe it was a big chunk chunkReceived(); } } function handleError(error) { const loggingEvent = { startTime: new Date(), categoryName: 'log4js', level: levels.ERROR, data: ['A worker log process hung up unexpectedly', error], remoteAddress: clientSocket.remoteAddress, remotePort: clientSocket.remotePort, }; actualAppender(loggingEvent); } clientSocket.on('data', chunkReceived); clientSocket.on('end', chunkReceived); clientSocket.on('error', handleError); }); server.listen( config.loggerPort || 5000, config.loggerHost || 'localhost', (e) => { debug('(master) master server listening, error was ', e); // allow the process to exit, if this is the only socket active server.unref(); } ); function app(event) { debug('(master) log event sent directly to actual appender (local event)'); return actualAppender(event); } app.shutdown = function (cb) { debug('(master) master shutdown called, closing server'); server.close(cb); }; return app; } function workerAppender(config) { let canWrite = false; const buffer = []; let socket; let shutdownAttempts = 3; function write(loggingEvent) { debug('(worker) Writing log event to socket'); socket.write(loggingEvent.serialise(), 'utf8'); socket.write(END_MSG, 'utf8'); } function emptyBuffer() { let evt; debug('(worker) emptying worker buffer'); while ((evt = buffer.shift())) { write(evt); } } function createSocket() { debug( `(worker) worker appender creating socket to ${ config.loggerHost || 'localhost' }:${config.loggerPort || 5000}` ); socket = net.createConnection( config.loggerPort || 5000, config.loggerHost || 'localhost' ); socket.on('connect', () => { debug('(worker) worker socket connected'); emptyBuffer(); canWrite = true; }); socket.on('timeout', socket.end.bind(socket)); socket.on('error', (e) => { debug('connection error', e); canWrite = false; emptyBuffer(); }); socket.on('close', createSocket); } createSocket(); function log(loggingEvent) { if (canWrite) { write(loggingEvent); } else { debug( '(worker) worker buffering log event because it cannot write at the moment' ); buffer.push(loggingEvent); } } log.shutdown = function (cb) { debug('(worker) worker shutdown called'); if (buffer.length && 
shutdownAttempts) { debug('(worker) worker buffer has items, waiting 100ms to empty'); shutdownAttempts -= 1; setTimeout(() => { log.shutdown(cb); }, 100); } else { socket.removeAllListeners('close'); socket.end(cb); } }; return log; } function createAppender(config, appender, levels) { if (config.mode === 'master') { debug('Creating master appender'); return logServer(config, appender, levels); } debug('Creating worker appender'); return workerAppender(config); } function configure(config, layouts, findAppender, levels) { let appender; debug(`configure with mode = ${config.mode}`); if (config.mode === 'master') { if (!config.appender) { debug(`no appender found in config ${config}`); throw new Error('multiprocess master must have an "appender" defined'); } debug(`actual appender is ${config.appender}`); appender = findAppender(config.appender); if (!appender) { debug(`actual appender "${config.appender}" not found`); throw new Error( `multiprocess master appender "${config.appender}" not defined` ); } } return createAppender(config, appender, levels); } module.exports.configure = configure;
-1
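The multiprocess appender in the row above switches on `config.mode`: a master listens on `loggerPort`/`loggerHost` (defaulting to 5000 and localhost) and forwards deserialised events to another named appender, while any other mode acts as a worker that serialises events over a socket. A configuration sketch under those assumptions; the appender names, the `"worker"` mode string, and the filename are illustrative:

```javascript
const log4js = require("log4js");

// Master process: accepts connections from workers and hands every event
// to the appender named in `appender` ("file" and the path are illustrative).
log4js.configure({
  appenders: {
    file: { type: "file", filename: "all-processes.log" },
    master: {
      type: "multiprocess",
      mode: "master",
      appender: "file",
      loggerPort: 5000,
      loggerHost: "localhost",
    },
  },
  categories: { default: { appenders: ["master"], level: "info" } },
});

// Worker processes: any mode other than "master" creates the socket-writing
// side, which buffers events until the connection to the master is open.
// log4js.configure({
//   appenders: {
//     worker: { type: "multiprocess", mode: "worker", loggerPort: 5000 },
//   },
//   categories: { default: { appenders: ["worker"], level: "info" } },
// });
```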
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./test/tap/newLevel-test.js
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); test('../../lib/logger', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test('creating a new log level', (t) => { log4js.configure({ levels: { DIAG: { value: 6000, colour: 'green' }, }, appenders: { stdout: { type: 'stdout' }, }, categories: { default: { appenders: ['stdout'], level: 'trace' }, }, }); const logger = log4js.getLogger(); t.test('should export new log level in levels module', (assert) => { assert.ok(log4js.levels.DIAG); assert.equal(log4js.levels.DIAG.levelStr, 'DIAG'); assert.equal(log4js.levels.DIAG.level, 6000); assert.equal(log4js.levels.DIAG.colour, 'green'); assert.end(); }); t.type( logger.diag, 'function', 'should create named function on logger prototype' ); t.type( logger.isDiagEnabled, 'function', 'should create isLevelEnabled function on logger prototype' ); t.type(logger.info, 'function', 'should retain default levels'); t.end(); }); batch.test('creating a new log level with underscores', (t) => { log4js.configure({ levels: { NEW_LEVEL_OTHER: { value: 6000, colour: 'blue' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); const logger = log4js.getLogger(); t.test('should export new log level to levels module', (assert) => { assert.ok(log4js.levels.NEW_LEVEL_OTHER); assert.equal(log4js.levels.NEW_LEVEL_OTHER.levelStr, 'NEW_LEVEL_OTHER'); assert.equal(log4js.levels.NEW_LEVEL_OTHER.level, 6000); assert.equal(log4js.levels.NEW_LEVEL_OTHER.colour, 'blue'); assert.end(); }); t.type( logger.newLevelOther, 'function', 'should create named function on logger prototype in camel case' ); t.type( logger.isNewLevelOtherEnabled, 'function', 'should create named isLevelEnabled function on logger prototype in camel case' ); t.end(); }); batch.test('creating log events containing newly created log level', (t) => { log4js.configure({ levels: { LVL1: { value: 6000, colour: 'grey' }, LVL2: { value: 5000, colour: 'magenta' }, }, appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'LVL1' }, }, }); const logger = log4js.getLogger(); logger.log(log4js.levels.getLevel('LVL1', log4js.levels.DEBUG), 'Event 1'); logger.log(log4js.levels.getLevel('LVL1'), 'Event 2'); logger.log('LVL1', 'Event 3'); logger.lvl1('Event 4'); logger.lvl2('Event 5'); const events = recording.replay(); t.test('should show log events with new log level', (assert) => { assert.equal(events[0].level.toString(), 'LVL1'); assert.equal(events[0].data[0], 'Event 1'); assert.equal(events[1].level.toString(), 'LVL1'); assert.equal(events[1].data[0], 'Event 2'); assert.equal(events[2].level.toString(), 'LVL1'); assert.equal(events[2].data[0], 'Event 3'); assert.equal(events[3].level.toString(), 'LVL1'); assert.equal(events[3].data[0], 'Event 4'); assert.end(); }); t.equal( events.length, 4, 'should not be present if min log level is greater than newly created level' ); t.end(); }); batch.test('creating a new log level with incorrect parameters', (t) => { t.throws(() => { log4js.configure({ levels: { cheese: { value: 'biscuits' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese".value must have an integer value'); t.throws(() => { log4js.configure({ levels: { cheese: 'biscuits', }, appenders: { stdout: { type: 'stdout' 
} }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must be an object'); t.throws(() => { log4js.configure({ levels: { cheese: { thing: 'biscuits' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must have a \'value\' property'); t.throws(() => { log4js.configure({ levels: { cheese: { value: 3 }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must have a \'colour\' property'); t.throws(() => { log4js.configure({ levels: { cheese: { value: 3, colour: 'pants' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese".colour must be one of white, grey, black, blue, cyan, green, magenta, red, yellow'); t.throws(() => { log4js.configure({ levels: { '#pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "#pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 'thing#pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "thing#pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { '1pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "1pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 2: 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "2" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 'cheese!': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "cheese!" 
is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.end(); }); batch.test('calling log with an undefined log level', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'trace' } }, }); const logger = log4js.getLogger(); // fallback behavior logger.log('LEVEL_DOES_NOT_EXIST', 'Event 1'); logger.log( log4js.levels.getLevel('LEVEL_DOES_NOT_EXIST'), 'Event 2', '2 Text' ); // synonym behavior logger.log('Event 3'); logger.log('Event 4', '4 Text'); const events = recording.replay(); t.equal(events[0].level.toString(), 'WARN', 'should log warning'); t.equal( events[0].data[0], 'log4js:logger.log: valid log-level not found as first parameter given:' ); t.equal(events[0].data[1], 'LEVEL_DOES_NOT_EXIST'); t.equal(events[1].level.toString(), 'INFO', 'should fall back to INFO'); t.equal(events[1].data[0], '[LEVEL_DOES_NOT_EXIST]'); t.equal(events[1].data[1], 'Event 1'); t.equal(events[2].level.toString(), 'WARN', 'should log warning'); t.equal( events[2].data[0], 'log4js:logger.log: valid log-level not found as first parameter given:' ); t.equal(events[2].data[1], undefined); t.equal(events[3].level.toString(), 'INFO', 'should fall back to INFO'); t.equal(events[3].data[0], '[undefined]'); t.equal(events[3].data[1], 'Event 2'); t.equal(events[3].data[2], '2 Text'); t.equal(events[4].level.toString(), 'INFO', 'LOG is synonym of INFO'); t.equal(events[4].data[0], 'Event 3'); t.equal(events[5].level.toString(), 'INFO', 'LOG is synonym of INFO'); t.equal(events[5].data[0], 'Event 4'); t.equal(events[5].data[1], '4 Text'); t.end(); }); batch.test('creating a new level with an existing level name', (t) => { log4js.configure({ levels: { info: { value: 1234, colour: 'blue' }, }, appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'all' } }, }); t.equal( log4js.levels.INFO.level, 1234, 'should override the existing log level' ); t.equal( log4js.levels.INFO.colour, 'blue', 'should override the existing log level' ); const logger = log4js.getLogger(); logger.info('test message'); const events = recording.replay(); t.equal( events[0].level.level, 1234, 'should override the existing log level' ); t.end(); }); batch.end(); });
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); test('../../lib/logger', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test('creating a new log level', (t) => { log4js.configure({ levels: { DIAG: { value: 6000, colour: 'green' }, }, appenders: { stdout: { type: 'stdout' }, }, categories: { default: { appenders: ['stdout'], level: 'trace' }, }, }); const logger = log4js.getLogger(); t.test('should export new log level in levels module', (assert) => { assert.ok(log4js.levels.DIAG); assert.equal(log4js.levels.DIAG.levelStr, 'DIAG'); assert.equal(log4js.levels.DIAG.level, 6000); assert.equal(log4js.levels.DIAG.colour, 'green'); assert.end(); }); t.type( logger.diag, 'function', 'should create named function on logger prototype' ); t.type( logger.isDiagEnabled, 'function', 'should create isLevelEnabled function on logger prototype' ); t.type(logger.info, 'function', 'should retain default levels'); t.end(); }); batch.test('creating a new log level with underscores', (t) => { log4js.configure({ levels: { NEW_LEVEL_OTHER: { value: 6000, colour: 'blue' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); const logger = log4js.getLogger(); t.test('should export new log level to levels module', (assert) => { assert.ok(log4js.levels.NEW_LEVEL_OTHER); assert.equal(log4js.levels.NEW_LEVEL_OTHER.levelStr, 'NEW_LEVEL_OTHER'); assert.equal(log4js.levels.NEW_LEVEL_OTHER.level, 6000); assert.equal(log4js.levels.NEW_LEVEL_OTHER.colour, 'blue'); assert.end(); }); t.type( logger.newLevelOther, 'function', 'should create named function on logger prototype in camel case' ); t.type( logger.isNewLevelOtherEnabled, 'function', 'should create named isLevelEnabled function on logger prototype in camel case' ); t.end(); }); batch.test('creating log events containing newly created log level', (t) => { log4js.configure({ levels: { LVL1: { value: 6000, colour: 'grey' }, LVL2: { value: 5000, colour: 'magenta' }, }, appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'LVL1' }, }, }); const logger = log4js.getLogger(); logger.log(log4js.levels.getLevel('LVL1', log4js.levels.DEBUG), 'Event 1'); logger.log(log4js.levels.getLevel('LVL1'), 'Event 2'); logger.log('LVL1', 'Event 3'); logger.lvl1('Event 4'); logger.lvl2('Event 5'); const events = recording.replay(); t.test('should show log events with new log level', (assert) => { assert.equal(events[0].level.toString(), 'LVL1'); assert.equal(events[0].data[0], 'Event 1'); assert.equal(events[1].level.toString(), 'LVL1'); assert.equal(events[1].data[0], 'Event 2'); assert.equal(events[2].level.toString(), 'LVL1'); assert.equal(events[2].data[0], 'Event 3'); assert.equal(events[3].level.toString(), 'LVL1'); assert.equal(events[3].data[0], 'Event 4'); assert.end(); }); t.equal( events.length, 4, 'should not be present if min log level is greater than newly created level' ); t.end(); }); batch.test('creating a new log level with incorrect parameters', (t) => { t.throws(() => { log4js.configure({ levels: { cheese: { value: 'biscuits' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese".value must have an integer value'); t.throws(() => { log4js.configure({ levels: { cheese: 'biscuits', }, appenders: { stdout: { type: 'stdout' 
} }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must be an object'); t.throws(() => { log4js.configure({ levels: { cheese: { thing: 'biscuits' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must have a \'value\' property'); t.throws(() => { log4js.configure({ levels: { cheese: { value: 3 }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese" must have a \'colour\' property'); t.throws(() => { log4js.configure({ levels: { cheese: { value: 3, colour: 'pants' }, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level "cheese".colour must be one of white, grey, black, blue, cyan, green, magenta, red, yellow'); t.throws(() => { log4js.configure({ levels: { '#pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "#pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 'thing#pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "thing#pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { '1pants': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "1pants" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 2: 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "2" is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.throws(() => { log4js.configure({ levels: { 'cheese!': 3, }, appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'trace' } }, }); }, 'level name "cheese!" 
is not a valid identifier (must start with a letter, only contain A-Z,a-z,0-9,_)'); t.end(); }); batch.test('calling log with an undefined log level', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'trace' } }, }); const logger = log4js.getLogger(); // fallback behavior logger.log('LEVEL_DOES_NOT_EXIST', 'Event 1'); logger.log( log4js.levels.getLevel('LEVEL_DOES_NOT_EXIST'), 'Event 2', '2 Text' ); // synonym behavior logger.log('Event 3'); logger.log('Event 4', '4 Text'); const events = recording.replay(); t.equal(events[0].level.toString(), 'WARN', 'should log warning'); t.equal( events[0].data[0], 'log4js:logger.log: valid log-level not found as first parameter given:' ); t.equal(events[0].data[1], 'LEVEL_DOES_NOT_EXIST'); t.equal(events[1].level.toString(), 'INFO', 'should fall back to INFO'); t.equal(events[1].data[0], '[LEVEL_DOES_NOT_EXIST]'); t.equal(events[1].data[1], 'Event 1'); t.equal(events[2].level.toString(), 'WARN', 'should log warning'); t.equal( events[2].data[0], 'log4js:logger.log: valid log-level not found as first parameter given:' ); t.equal(events[2].data[1], undefined); t.equal(events[3].level.toString(), 'INFO', 'should fall back to INFO'); t.equal(events[3].data[0], '[undefined]'); t.equal(events[3].data[1], 'Event 2'); t.equal(events[3].data[2], '2 Text'); t.equal(events[4].level.toString(), 'INFO', 'LOG is synonym of INFO'); t.equal(events[4].data[0], 'Event 3'); t.equal(events[5].level.toString(), 'INFO', 'LOG is synonym of INFO'); t.equal(events[5].data[0], 'Event 4'); t.equal(events[5].data[1], '4 Text'); t.end(); }); batch.test('creating a new level with an existing level name', (t) => { log4js.configure({ levels: { info: { value: 1234, colour: 'blue' }, }, appenders: { recorder: { type: 'recording' } }, categories: { default: { appenders: ['recorder'], level: 'all' } }, }); t.equal( log4js.levels.INFO.level, 1234, 'should override the existing log level' ); t.equal( log4js.levels.INFO.colour, 'blue', 'should override the existing log level' ); const logger = log4js.getLogger(); logger.info('test message'); const events = recording.replay(); t.equal( events[0].level.level, 1234, 'should override the existing log level' ); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./docs/writing-appenders.md
# Writing Appenders for Log4js Log4js can load appenders from outside its core set. To add a custom appender, the easiest way is to make it a stand-alone module and publish to npm. You can also load appenders from your own application, but they must be defined in a module. ## Loading mechanism When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using variations of the `type` value. Log4js checks the following places (in this order) for appenders based on the type value: 1. Bundled core appenders (within appenders directory): `require('./' + type)` 2. node_modules: `require(type)` 3. relative to the main file of your application: `require(path.dirname(require.main.filename) + '/' + type)` 4. relative to the process' current working directory: `require(process.cwd() + '/' + type)` If that fails, an error will be raised. ## Appender Modules An appender module should export a single function called `configure`. The function should accept the following arguments: - `config` - `object` - the appender's configuration object - `layouts` - `module` - gives access to the [layouts](layouts.md) module, which most appenders will need - `layout` - `function(type, config)` - this is the main function that appenders will use to find a layout - `findAppender` - `function(name)` - if your appender is a wrapper around another appender (like the [logLevelFilter](logLevelFilter.md) for example), this function can be used to find another appender by name - `levels` - `module` - gives access to the [levels](levels.md) module, which most appenders will need `configure` should return a function which accepts a logEvent, which is the appender itself. One of the simplest examples is the [stdout](stdout.md) appender. Let's run through the code. ## Example ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } // stdout configure doesn't need to use findAppender, or levels function configure(config, layouts) { // the default layout for the appender let layout = layouts.colouredLayout; // check if there is another layout specified if (config.layout) { // load the layout layout = layouts.layout(config.layout.type, config.layout); } //create a new appender instance return stdoutAppender(layout, config.timezoneOffset); } //export the only function needed exports.configure = configure; ``` # Shutdown functions It's a good idea to implement a `shutdown` function on your appender instances. This function will get called by `log4js.shutdown` and signals that `log4js` has been asked to stop logging. Usually this is because of a fatal exception, or the application is being stopped. Your shutdown function should make sure that all asynchronous operations finish, and that any resources are cleaned up. The function must be named `shutdown`, take one callback argument, and be a property of the appender instance. Let's add a shutdown function to the `stdout` appender as an example. 
## Example (shutdown) ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself const appender = (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; // add a shutdown function. appender.shutdown = (done) => { process.stdout.write("", done); }; return appender; } // ... rest of the code as above ```
# Writing Appenders for Log4js Log4js can load appenders from outside its core set. To add a custom appender, the easiest way is to make it a stand-alone module and publish to npm. You can also load appenders from your own application, but they must be defined in a module. ## Loading mechanism When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using variations of the `type` value. Log4js checks the following places (in this order) for appenders based on the type value: 1. Bundled core appenders (within appenders directory): `require('./' + type)` 2. node_modules: `require(type)` 3. relative to the main file of your application: `require(path.dirname(require.main.filename) + '/' + type)` 4. relative to the process' current working directory: `require(process.cwd() + '/' + type)` If that fails, an error will be raised. ## Appender Modules An appender module should export a single function called `configure`. The function should accept the following arguments: - `config` - `object` - the appender's configuration object - `layouts` - `module` - gives access to the [layouts](layouts.md) module, which most appenders will need - `layout` - `function(type, config)` - this is the main function that appenders will use to find a layout - `findAppender` - `function(name)` - if your appender is a wrapper around another appender (like the [logLevelFilter](logLevelFilter.md) for example), this function can be used to find another appender by name - `levels` - `module` - gives access to the [levels](levels.md) module, which most appenders will need `configure` should return a function which accepts a logEvent, which is the appender itself. One of the simplest examples is the [stdout](stdout.md) appender. Let's run through the code. ## Example ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } // stdout configure doesn't need to use findAppender, or levels function configure(config, layouts) { // the default layout for the appender let layout = layouts.colouredLayout; // check if there is another layout specified if (config.layout) { // load the layout layout = layouts.layout(config.layout.type, config.layout); } //create a new appender instance return stdoutAppender(layout, config.timezoneOffset); } //export the only function needed exports.configure = configure; ``` # Shutdown functions It's a good idea to implement a `shutdown` function on your appender instances. This function will get called by `log4js.shutdown` and signals that `log4js` has been asked to stop logging. Usually this is because of a fatal exception, or the application is being stopped. Your shutdown function should make sure that all asynchronous operations finish, and that any resources are cleaned up. The function must be named `shutdown`, take one callback argument, and be a property of the appender instance. Let's add a shutdown function to the `stdout` appender as an example. 
## Example (shutdown) ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself const appender = (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; // add a shutdown function. appender.shutdown = (done) => { process.stdout.write("", done); }; return appender; } // ... rest of the code as above ```
-1
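writing-appenders.md, shown unchanged in this row, documents the `configure(config, layouts, findAppender, levels)` contract and the optional `shutdown(done)` hook that `log4js.shutdown` calls. A compact sketch of a stand-alone appender module following that contract; the module name and in-memory buffer are hypothetical and used only for illustration:

```javascript
// memory-appender.js (hypothetical): resolvable via `type: "memory-appender"`
// when placed next to the application's main file, per the loading mechanism
// described in writing-appenders.md.
const events = [];

function memoryAppender(layout, timezoneOffset) {
  // The appender function itself: format and buffer each logging event.
  const appender = (loggingEvent) => {
    events.push(layout(loggingEvent, timezoneOffset));
  };

  // shutdown hook: log4js.shutdown() calls this so the appender can finish
  // any pending work before signalling completion through `done`.
  appender.shutdown = (done) => {
    events.length = 0;
    done();
  };

  return appender;
}

function configure(config, layouts) {
  // Use the configured layout if one is given, otherwise fall back to a default.
  let layout = layouts.colouredLayout;
  if (config.layout) {
    layout = layouts.layout(config.layout.type, config.layout);
  }
  return memoryAppender(layout, config.timezoneOffset);
}

exports.configure = configure;
```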
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./examples/example.js
'use strict'; const log4js = require('../lib/log4js'); // log the cheese logger messages to a file, and the console ones as well. log4js.configure({ appenders: { cheeseLogs: { type: 'file', filename: 'cheese.log' }, console: { type: 'console' }, }, categories: { cheese: { appenders: ['cheeseLogs'], level: 'error' }, another: { appenders: ['console'], level: 'trace' }, default: { appenders: ['console', 'cheeseLogs'], level: 'trace' }, }, }); // a custom logger outside of the log4js/lib/appenders directory can be accessed like so // log4js.configure({ // appenders: { outside: { type: 'what/you/would/put/in/require', otherArgs: 'blah' } } // ... // }); const logger = log4js.getLogger('cheese'); // only errors and above get logged. const otherLogger = log4js.getLogger(); // this will get coloured output on console, and appear in cheese.log otherLogger.error('AAArgh! Something went wrong', { some: 'otherObject', useful_for: 'debug purposes', }); otherLogger.log('This should appear as info output'); // these will not appear (logging level beneath error) logger.trace('Entering cheese testing'); logger.debug('Got cheese.'); logger.info('Cheese is Gouda.'); logger.log('Something funny about cheese.'); logger.warn('Cheese is quite smelly.'); // these end up only in cheese.log logger.error('Cheese %s is too ripe!', 'gouda'); logger.fatal('Cheese was breeding ground for listeria.'); // these don't end up in cheese.log, but will appear on the console const anotherLogger = log4js.getLogger('another'); anotherLogger.debug('Just checking'); // will also go to console and cheese.log, since that's configured for all categories const pantsLog = log4js.getLogger('pants'); pantsLog.debug('Something for pants');
./lib/appenders/stdout.js
function stdoutAppender(layout, timezoneOffset) {
  return (loggingEvent) => {
    process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`);
  };
}

function configure(config, layouts) {
  let layout = layouts.colouredLayout;
  if (config.layout) {
    layout = layouts.layout(config.layout.type, config.layout);
  }
  return stdoutAppender(layout, config.timezoneOffset);
}

exports.configure = configure;
./examples/logstashUDP.js
const log4js = require('../lib/log4js');

/*
  Sample logstash config:
    udp {
      codec => json
      port => 10001
      queue_size => 2
      workers => 2
      type => myAppType
    }
*/
log4js.configure({
  appenders: {
    console: {
      type: 'console',
    },
    logstash: {
      host: '127.0.0.1',
      port: 10001,
      type: 'logstashUDP',
      logType: 'myAppType', // Optional, defaults to 'category'
      fields: {
        // Optional, will be added to the 'fields' object in logstash
        field1: 'value1',
        field2: 'value2',
      },
      layout: {
        type: 'pattern',
        pattern: '%m',
      },
    },
  },
  categories: {
    default: { appenders: ['console', 'logstash'], level: 'info' },
  },
});

const logger = log4js.getLogger('myLogger');
logger.info('Test log message %s', 'arg1', 'arg2');
./docs/tcp.md
# TCP Appender

The TCP appender sends log events to a master server over TCP sockets. It can be used as a simple way to centralise logging when you have multiple servers or processes. It uses the node.js core networking modules, and so does not require any extra dependencies. Remember to call `log4js.shutdown` when your application terminates, so that the sockets get closed cleanly. It's designed to work with the [tcp-server](tcp-server.md), but it doesn't necessarily have to, just make sure whatever is listening at the other end is expecting JSON objects as strings.

## Configuration

- `type` - `tcp`
- `port` - `integer` (optional, defaults to `5000`) - the port to send to
- `host` - `string` (optional, defaults to `localhost`) - the host/IP address to send to
- `endMsg` - `string` (optional, defaults to `__LOG4JS__`) - the delimiter that marks the end of a log message
- `layout` - `object` (optional, defaults to a serialized log event) - see [layouts](layouts.md)

## Example

```javascript
log4js.configure({
  appenders: {
    network: { type: "tcp", host: "log.server" },
  },
  categories: {
    default: { appenders: ["network"], level: "error" },
  },
});
```

This will send all error messages to `log.server:5000`.
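To illustrate the "JSON objects as strings" point, here is a rough sketch of a custom listener, assuming the appender's default port (`5000`) and default `__LOG4JS__` delimiter; it is not the bundled [tcp-server](tcp-server.md), just an outline of what the receiving end has to do.

```javascript
const net = require("net");

// a minimal sketch of a listener compatible with the tcp appender defaults
net
  .createServer((socket) => {
    let buffer = "";
    socket.on("data", (chunk) => {
      buffer += chunk.toString();
      const pieces = buffer.split("__LOG4JS__");
      buffer = pieces.pop(); // keep any trailing partial message for the next chunk
      pieces
        .filter((piece) => piece.length > 0)
        .forEach((piece) => {
          const event = JSON.parse(piece); // each piece is a serialised log event
          console.log(event);
        });
    });
  })
  .listen(5000);
```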
./docs/terms.md
## Terminology

`Level` - a log level is the severity or priority of a log event (debug, info, etc). Whether an _appender_ will see the event or not is determined by the _category_'s level. If this is less than or equal to the event's level, it will be sent to the category's appender(s).

`Category` - a label for grouping log events. This can be based on module (e.g. 'auth', 'payment', 'http'), or anything you like. Log events with the same _category_ will go to the same _appenders_. Log4js supports a hierarchy for categories, using dots to separate layers - for example, log events in the category 'myapp.submodule' will use the level for 'myapp' if none is defined for 'myapp.submodule', and also any appenders defined for 'myapp'. (This behaviour can be disabled by setting inherit=false on the sub-category.) The category for log events is defined when you get a _Logger_ from log4js (`log4js.getLogger('somecategory')`).

`Appender` - appenders are responsible for output of log events. They may write events to files, send emails, store them in a database, or anything. Most appenders use _layouts_ to serialise the events to strings for output.

`Logger` - this is your code's main interface with log4js. A logger instance may have an optional _category_, defined when you create the instance. Loggers provide the `info`, `debug`, `error`, etc functions that create _LogEvents_ and pass them on to appenders.

`Layout` - a function for converting a _LogEvent_ into a string representation. Log4js comes with a few different implementations: basic, coloured, and a more configurable pattern based layout.

`LogEvent` - a log event has a timestamp, a level, and optional category, data, and context properties. When you call `logger.info('cheese value:', edam)` the _logger_ will create a log event with the timestamp of now, a _level_ of INFO, a _category_ that was chosen when the logger was created, and a data array with two values (the string 'cheese value:', and the object 'edam'), along with any context data that was added to the logger.
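A short sketch tying these terms together; the category names, appender name, and context key below are placeholders chosen for illustration, not values mandated by log4js.

```javascript
const log4js = require("log4js");

log4js.configure({
  appenders: { out: { type: "stdout" } }, // an Appender, using its default Layout
  categories: {
    default: { appenders: ["out"], level: "info" },
    "myapp.submodule": { appenders: ["out"], level: "debug" }, // a sub-Category with its own Level
  },
});

const logger = log4js.getLogger("myapp.submodule"); // a Logger bound to a Category
logger.addContext("requestId", "abc-123"); // context data carried on every LogEvent
logger.debug("cheese value:", { name: "edam" }); // creates a LogEvent with a data array
```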
./test/tap/disable-cluster-test.js
const { test } = require('tap');
const cluster = require('cluster');
const log4js = require('../../lib/log4js');
const recorder = require('../../lib/appenders/recording');

cluster.removeAllListeners();

log4js.configure({
  appenders: {
    vcr: { type: 'recording' },
  },
  categories: { default: { appenders: ['vcr'], level: 'debug' } },
  disableClustering: true,
});

if (cluster.isMaster) {
  cluster.fork();

  const masterLogger = log4js.getLogger('master');
  const masterPid = process.pid;
  masterLogger.info('this is master');

  cluster.on('exit', () => {
    const logEvents = recorder.replay();

    test('cluster master', (batch) => {
      batch.test('only master events should be logged', (t) => {
        t.equal(logEvents.length, 1);
        t.equal(logEvents[0].categoryName, 'master');
        t.equal(logEvents[0].pid, masterPid);
        t.equal(logEvents[0].data[0], 'this is master');
        t.end();
      });

      batch.end();
    });
  });
} else {
  const workerLogger = log4js.getLogger('worker');
  workerLogger.info('this is worker', new Error('oh dear'));

  const workerEvents = recorder.replay();

  test('cluster worker', (batch) => {
    batch.test('should send events to its own appender', (t) => {
      t.equal(workerEvents.length, 1);
      t.equal(workerEvents[0].categoryName, 'worker');
      t.equal(workerEvents[0].data[0], 'this is worker');
      t.type(workerEvents[0].data[1], 'Error');
      t.match(workerEvents[0].data[1].stack, 'Error: oh dear');
      t.end();
    });

    batch.end();
  });

  // test sending a cluster-style log message
  process.send({ topic: 'log4js:message', data: { cheese: 'gouda' } });
  cluster.worker.disconnect();
}
./test/tap/multi-file-appender-test.js
const process = require('process'); const { test } = require('tap'); const debug = require('debug'); const fs = require('fs'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; test('multiFile appender', (batch) => { batch.test( 'should write to multiple files based on the loggingEvent property', (t) => { t.teardown(async () => { await removeFiles(['logs/A.log', 'logs/B.log']); }); log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'categoryName', extension: '.log', }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerA = log4js.getLogger('A'); const loggerB = log4js.getLogger('B'); loggerA.info('I am in logger A'); loggerB.info('I am in logger B'); log4js.shutdown(() => { t.match(fs.readFileSync('logs/A.log', 'utf-8'), 'I am in logger A'); t.match(fs.readFileSync('logs/B.log', 'utf-8'), 'I am in logger B'); t.end(); }); } ); batch.test( 'should write to multiple files based on loggingEvent.context properties', (t) => { t.teardown(async () => { await removeFiles(['logs/C.log', 'logs/D.log']); }); log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerC = log4js.getLogger('cheese'); const loggerD = log4js.getLogger('biscuits'); loggerC.addContext('label', 'C'); loggerD.addContext('label', 'D'); loggerC.info('I am in logger C'); loggerD.info('I am in logger D'); log4js.shutdown(() => { t.match(fs.readFileSync('logs/C.log', 'utf-8'), 'I am in logger C'); t.match(fs.readFileSync('logs/D.log', 'utf-8'), 'I am in logger D'); t.end(); }); } ); batch.test('should close file after timeout', (t) => { /* checking that the file is closed after a timeout is done by looking at the debug logs since detecting file locks with node.js is platform specific. 
*/ const debugWasEnabled = debug.enabled('log4js:multiFile'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:multiFile`); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFiles('logs/C.log'); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const timeoutMs = 50; log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', timeout: timeoutMs, }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerC = log4js.getLogger('cheese'); loggerC.addContext('label', 'C'); loggerC.info('I am in logger C'); setTimeout(() => { t.match( debugLogs[debugLogs.length - 1], `C not used for > ${timeoutMs} ms => close`, '(timeout1) should have closed' ); t.end(); }, timeoutMs * 1 + osDelay); }); batch.test('should close file safely after timeout', (t) => { const error = new Error('fileAppender shutdown error'); const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { './appenders/file': { configure(config, layouts) { const fileAppender = require('../../lib/appenders/file').configure( config, layouts ); const originalShutdown = fileAppender.shutdown; fileAppender.shutdown = function (complete) { const onCallback = function () { complete(error); }; originalShutdown(onCallback); }; return fileAppender; }, }, debug, }, }); /* checking that the file is closed after a timeout is done by looking at the debug logs since detecting file locks with node.js is platform specific. */ const debugWasEnabled = debug.enabled('log4js:multiFile'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:multiFile`); t.teardown(async () => { await new Promise((resolve) => { sandboxedLog4js.shutdown(resolve); }); await removeFiles('logs/C.log'); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const timeoutMs = 50; sandboxedLog4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', timeout: timeoutMs, }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerC = sandboxedLog4js.getLogger('cheese'); loggerC.addContext('label', 'C'); loggerC.info('I am in logger C'); setTimeout(() => { t.match( debugLogs[debugLogs.length - 2], `C not used for > ${timeoutMs} ms => close`, '(timeout1) should have closed' ); t.match( debugLogs[debugLogs.length - 1], `ignore error on file shutdown: ${error.message}`, 'safely shutdown' ); t.end(); }, timeoutMs * 1 + osDelay); }); batch.test('should close file after extended timeout', (t) => { /* checking that the file is closed after a timeout is done by looking at the debug logs since detecting file locks with node.js is platform specific. 
*/ const debugWasEnabled = debug.enabled('log4js:multiFile'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:multiFile`); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFiles('logs/D.log'); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const timeoutMs = 1000; log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', timeout: timeoutMs, }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerD = log4js.getLogger('cheese'); loggerD.addContext('label', 'D'); loggerD.info('I am in logger D'); setTimeout(() => { loggerD.info('extending activity!'); t.match( debugLogs[debugLogs.length - 1], 'D extending activity', 'should have extended' ); }, timeoutMs / 2); setTimeout(() => { t.notOk( debugLogs.some( (s) => s.indexOf(`D not used for > ${timeoutMs} ms => close`) !== -1 ), '(timeout1) should not have closed' ); }, timeoutMs * 1 + osDelay); setTimeout(() => { t.match( debugLogs[debugLogs.length - 1], `D not used for > ${timeoutMs} ms => close`, '(timeout2) should have closed' ); t.end(); }, timeoutMs * 2 + osDelay); }); batch.test('should clear interval for active timers on shutdown', (t) => { /* checking that the file is closed after a timeout is done by looking at the debug logs since detecting file locks with node.js is platform specific. */ const debugWasEnabled = debug.enabled('log4js:multiFile'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:multiFile`); t.teardown(async () => { await removeFiles('logs/D.log'); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const timeoutMs = 100; log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', timeout: timeoutMs, }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerD = log4js.getLogger('cheese'); loggerD.addContext('label', 'D'); loggerD.info('I am in logger D'); log4js.shutdown(() => { t.notOk( debugLogs.some( (s) => s.indexOf(`D not used for > ${timeoutMs} ms => close`) !== -1 ), 'should not have closed' ); t.ok( debugLogs.some((s) => s.indexOf('clearing timer for D') !== -1), 'should have cleared timers' ); t.match( debugLogs[debugLogs.length - 1], 'calling shutdown for D', 'should have called shutdown' ); t.end(); }); }); batch.test( 'should fail silently if loggingEvent property has no value', (t) => { t.teardown(async () => { await removeFiles('logs/E.log'); }); log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerE = log4js.getLogger(); loggerE.addContext('label', 'E'); loggerE.info('I am in logger E'); loggerE.removeContext('label'); loggerE.info('I am not in logger E'); loggerE.addContext('label', null); loggerE.info('I am also not in logger E'); log4js.shutdown(() => { const contents 
= fs.readFileSync('logs/E.log', 'utf-8'); t.match(contents, 'I am in logger E'); t.notMatch(contents, 'I am not in logger E'); t.notMatch(contents, 'I am also not in logger E'); t.end(); }); } ); batch.test('should pass options to rolling file stream', (t) => { t.teardown(async () => { await removeFiles(['logs/F.log', 'logs/F.log.1', 'logs/F.log.2']); }); log4js.configure({ appenders: { multi: { type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', maxLogSize: 30, backups: 2, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['multi'], level: 'info' } }, }); const loggerF = log4js.getLogger(); loggerF.addContext('label', 'F'); loggerF.info('Being in logger F is the best.'); loggerF.info('I am also in logger F, awesome'); loggerF.info('I am in logger F'); log4js.shutdown(() => { let contents = fs.readFileSync('logs/F.log', 'utf-8'); t.match(contents, 'I am in logger F'); contents = fs.readFileSync('logs/F.log.1', 'utf-8'); t.match(contents, 'I am also in logger F'); contents = fs.readFileSync('logs/F.log.2', 'utf-8'); t.match(contents, 'Being in logger F is the best'); t.end(); }); }); batch.test('should inherit config from category hierarchy', (t) => { t.teardown(async () => { await removeFiles('logs/test.someTest.log'); }); log4js.configure({ appenders: { out: { type: 'stdout' }, test: { type: 'multiFile', base: 'logs/', property: 'categoryName', extension: '.log', }, }, categories: { default: { appenders: ['out'], level: 'info' }, test: { appenders: ['test'], level: 'debug' }, }, }); const testLogger = log4js.getLogger('test.someTest'); testLogger.debug('This should go to the file'); log4js.shutdown(() => { const contents = fs.readFileSync('logs/test.someTest.log', 'utf-8'); t.match(contents, 'This should go to the file'); t.end(); }); }); batch.test('should shutdown safely even if it is not used', (t) => { log4js.configure({ appenders: { out: { type: 'stdout' }, test: { type: 'multiFile', base: 'logs/', property: 'categoryName', extension: '.log', }, }, categories: { default: { appenders: ['out'], level: 'info' }, test: { appenders: ['test'], level: 'debug' }, }, }); log4js.shutdown(() => { t.ok('callback is called'); t.end(); }); }); batch.teardown(async () => { try { const files = fs.readdirSync('logs'); await removeFiles(files.map((filename) => `logs/${filename}`)); fs.rmdirSync('logs'); } catch (e) { // doesn't matter } }); batch.end(); });
./test/tap/noLogFilter-test.js
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); /** * test a simple regexp */ test('log4js noLogFilter', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test( 'appender should exclude events that match the regexp string', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: 'This.*not', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.end(); } ); /** * test an array of regexp */ batch.test( 'appender should exclude events that match the regexp string contained in the array', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['This.*not', 'instead'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); logger.debug('This case instead it should get logged'); logger.debug('The last that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 3); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.equal(logEvents[2].data[0], 'The last that should get logged'); t.end(); } ); /** * test case insentitive regexp */ batch.test( 'appender should evaluate the regexp using incase sentitive option', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', 'eX.*de'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug('Exclude this string'); logger.debug('Include this string'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Include this string'); t.end(); } ); /** * test empty string or null regexp */ batch.test( 'appender should skip the match in case of empty or null regexp', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['', null, undefined], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('Another string that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Another string that should get logged'); t.end(); } ); /** * test for excluding all the events that contains digits */ 
batch.test('appender should exclude the events that contains digits', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: '\\d', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('The 2nd event should not get logged'); logger.debug('The 3rd event should not get logged, such as the 2nd'); const logEvents = recording.replay(); t.equal(logEvents.length, 1); t.equal(logEvents[0].data[0], 'This should get logged'); t.end(); }); /** * test the cases provided in the documentation * https://log4js-node.github.io/log4js-node/noLogFilter.html */ batch.test( 'appender should exclude not valid events according to the documentation', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', '\\d', ''], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('I will be logged in all-the-logs.log'); logger.debug('I will be not logged in all-the-logs.log'); logger.debug('A 2nd message that will be excluded in all-the-logs.log'); logger.debug('Hello again'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'I will be logged in all-the-logs.log'); t.equal(logEvents[1].data[0], 'Hello again'); t.end(); } ); batch.end(); });
./examples/loggly-appender.js
// Note that loggly appender needs node-loggly to work.
// If you haven't got node-loggly installed, you'll get cryptic
// "cannot find module" errors when using the loggly appender
const log4js = require('../lib/log4js');

log4js.configure({
  appenders: {
    console: {
      type: 'console',
    },
    loggly: {
      type: 'loggly',
      token: '12345678901234567890',
      subdomain: 'your-subdomain',
      tags: ['test'],
    },
  },
  categories: {
    default: { appenders: ['console'], level: 'info' },
    loggly: { appenders: ['loggly'], level: 'info' },
  },
});

const logger = log4js.getLogger('loggly');
logger.info('Test log message');
// logger.debug("Test log message");
./docs/_config.yml
theme: jekyll-theme-minimal
repository: nomiddlename/log4js-node
./test/tap/configuration-validation-test.js
const { test } = require('tap'); const util = require('util'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const debug = require('debug')('log4js:test.configuration-validation'); const deepFreeze = require('deep-freeze'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const configuration = require('../../lib/configuration'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; const testAppender = (label, result) => ({ configure(config, layouts, findAppender) { debug( `testAppender(${label}).configure called, with config: ${util.inspect( config )}` ); result.configureCalled = true; result.type = config.type; result.label = label; result.config = config; result.layouts = layouts; result.findAppender = findAppender; return {}; }, }); test('log4js configuration validation', (batch) => { batch.test('should give error if config is just plain silly', (t) => { [null, undefined, '', ' ', []].forEach((config) => { const expectedError = new Error( `Problem with log4js configuration: (${util.inspect( config )}) - must be an object.` ); t.throws(() => configuration.configure(config), expectedError); }); t.end(); }); batch.test('should give error if config is an empty object', (t) => { t.throws( () => log4js.configure({}), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no appenders', (t) => { t.throws( () => log4js.configure({ categories: {} }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no categories', (t) => { t.throws( () => log4js.configure({ appenders: { out: { type: 'stdout' } } }), '- must have a property "categories" of type object.' ); t.end(); }); batch.test('should give error if appenders is not an object', (t) => { t.throws( () => log4js.configure({ appenders: [], categories: [] }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if appenders are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { thing: 'cheese' }, categories: {} }), '- appender "thing" is not valid (must be an object with property "type")' ); t.end(); }); batch.test('should require at least one appender', (t) => { t.throws( () => log4js.configure({ appenders: {}, categories: {} }), '- must define at least one appender.' ); t.end(); }); batch.test('should give error if categories are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: 'cheese' }, }), '- category "thing" is not valid (must be an object with properties "appenders" and "level")' ); t.end(); }); batch.test('should give error if default category not defined', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['stdout'], level: 'ERROR' } }, }), '- must define a "default" category.' ); t.end(); }); batch.test('should require at least one category', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: {}, }), '- must define at least one category.' 
); t.end(); }); batch.test('should give error if category.appenders is not an array', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: {}, level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must be an array of appender names)' ); t.end(); }); batch.test('should give error if category.appenders is empty', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: [], level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must contain at least one appender name)' ); t.end(); }); batch.test( 'should give error if categories do not refer to valid appenders', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['cheese'], level: 'ERROR' } }, }), '- category "thing" is not valid (appender "cheese" is not defined)' ); t.end(); } ); batch.test('should give error if category level is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Biscuits' } }, }), '- category "default" is not valid (level "Biscuits" not recognised; valid levels are ALL, TRACE' ); t.end(); }); batch.test( 'should give error if category enableCallStack is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Debug', enableCallStack: '123', }, }, }), '- category "default" is not valid (enableCallStack must be boolean type)' ); t.end(); } ); batch.test('should give error if appender type cannot be found', (t) => { t.throws( () => log4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }), '- appender "thing" is not valid (type "cheese" could not be found)' ); t.end(); }); batch.test('should create appender instances', (t) => { const thing = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cheese: testAppender('cheesy', thing), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.equal(thing.type, 'cheese'); t.end(); }); batch.test( 'should use provided appender instance if instance provided', (t) => { const thing = {}; const cheese = testAppender('cheesy', thing); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: cheese } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.same(thing.type, cheese); t.end(); } ); batch.test('should not throw error if configure object is freezed', (t) => { const testFile = 'test/tap/freeze-date-file-test'; t.teardown(async () => { await removeFiles(testFile); }); t.doesNotThrow(() => log4js.configure( deepFreeze({ appenders: { dateFile: { type: 'dateFile', filename: testFile, alwaysIncludePattern: false, }, }, categories: { default: { appenders: ['dateFile'], level: log4js.levels.ERROR }, }, }) ) ); log4js.shutdown(() => { t.end(); }); }); batch.test('should load appenders from core first', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { './cheese': testAppender('correct', result), cheese: testAppender('wrong', result), }, 
ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); }); batch.test( 'should load appenders relative to main file if not in core, or node_modules', (t) => { const result = {}; const mainPath = path.dirname(require.main.filename); const sandboxConfig = { ignoreMissing: true, requires: {}, }; sandboxConfig.requires[`${mainPath}/cheese`] = testAppender( 'correct', result ); // add this one, because when we're running coverage the main path is a bit different sandboxConfig.requires[ `${path.join(mainPath, '../../node_modules/nyc/bin/cheese')}` ] = testAppender('correct', result); // in tap v15, the main path is at root of log4js (run `DEBUG=log4js:appenders npm test > /dev/null` to check) sandboxConfig.requires[`${path.join(mainPath, '../../cheese')}`] = testAppender('correct', result); // in node v6, there's an extra layer of node modules for some reason, so add this one to work around it sandboxConfig.requires[ `${path.join( mainPath, '../../node_modules/tap/node_modules/nyc/bin/cheese' )}` ] = testAppender('correct', result); const sandboxedLog4js = sandbox.require( '../../lib/log4js', sandboxConfig ); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test( 'should load appenders relative to process.cwd if not found in core, node_modules', (t) => { const result = {}; const fakeProcess = new Proxy(process, { get(target, key) { if (key === 'cwd') { return () => '/var/lib/cheese'; } return target[key]; }, }); // windows file paths are different to unix, so let's make this work for both. 
const requires = {}; requires[path.join('/var', 'lib', 'cheese', 'cheese')] = testAppender( 'correct', result ); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires, globals: { process: fakeProcess, }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test('should pass config, layout, findAppender to appenders', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires: { cheese: testAppender('cheesy', result), notCheese: testAppender('notCheesy', {}), }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese', foo: 'bar' }, thing2: { type: 'notCheese' }, }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.config.foo, 'bar'); t.type(result.layouts, 'object'); t.type(result.layouts.basicLayout, 'function'); t.type(result.findAppender, 'function'); t.type(result.findAppender('thing2'), 'object'); t.end(); }); batch.test( 'should not give error if level object is used instead of string', (t) => { t.doesNotThrow(() => log4js.configure({ appenders: { thing: { type: 'stdout' } }, categories: { default: { appenders: ['thing'], level: log4js.levels.ERROR }, }, }) ); t.end(); } ); batch.test( 'should not create appender instance if not used in categories', (t) => { const used = {}; const notUsed = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cat: testAppender('meow', used), dog: testAppender('woof', notUsed), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { used: { type: 'cat' }, notUsed: { type: 'dog' } }, categories: { default: { appenders: ['used'], level: 'ERROR' } }, }); t.ok(used.configureCalled); t.notOk(notUsed.configureCalled); t.end(); } ); batch.end(); });
const { test } = require('tap'); const util = require('util'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const debug = require('debug')('log4js:test.configuration-validation'); const deepFreeze = require('deep-freeze'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const configuration = require('../../lib/configuration'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; const testAppender = (label, result) => ({ configure(config, layouts, findAppender) { debug( `testAppender(${label}).configure called, with config: ${util.inspect( config )}` ); result.configureCalled = true; result.type = config.type; result.label = label; result.config = config; result.layouts = layouts; result.findAppender = findAppender; return {}; }, }); test('log4js configuration validation', (batch) => { batch.test('should give error if config is just plain silly', (t) => { [null, undefined, '', ' ', []].forEach((config) => { const expectedError = new Error( `Problem with log4js configuration: (${util.inspect( config )}) - must be an object.` ); t.throws(() => configuration.configure(config), expectedError); }); t.end(); }); batch.test('should give error if config is an empty object', (t) => { t.throws( () => log4js.configure({}), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no appenders', (t) => { t.throws( () => log4js.configure({ categories: {} }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no categories', (t) => { t.throws( () => log4js.configure({ appenders: { out: { type: 'stdout' } } }), '- must have a property "categories" of type object.' ); t.end(); }); batch.test('should give error if appenders is not an object', (t) => { t.throws( () => log4js.configure({ appenders: [], categories: [] }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if appenders are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { thing: 'cheese' }, categories: {} }), '- appender "thing" is not valid (must be an object with property "type")' ); t.end(); }); batch.test('should require at least one appender', (t) => { t.throws( () => log4js.configure({ appenders: {}, categories: {} }), '- must define at least one appender.' ); t.end(); }); batch.test('should give error if categories are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: 'cheese' }, }), '- category "thing" is not valid (must be an object with properties "appenders" and "level")' ); t.end(); }); batch.test('should give error if default category not defined', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['stdout'], level: 'ERROR' } }, }), '- must define a "default" category.' ); t.end(); }); batch.test('should require at least one category', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: {}, }), '- must define at least one category.' 
); t.end(); }); batch.test('should give error if category.appenders is not an array', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: {}, level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must be an array of appender names)' ); t.end(); }); batch.test('should give error if category.appenders is empty', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: [], level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must contain at least one appender name)' ); t.end(); }); batch.test( 'should give error if categories do not refer to valid appenders', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['cheese'], level: 'ERROR' } }, }), '- category "thing" is not valid (appender "cheese" is not defined)' ); t.end(); } ); batch.test('should give error if category level is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Biscuits' } }, }), '- category "default" is not valid (level "Biscuits" not recognised; valid levels are ALL, TRACE' ); t.end(); }); batch.test( 'should give error if category enableCallStack is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Debug', enableCallStack: '123', }, }, }), '- category "default" is not valid (enableCallStack must be boolean type)' ); t.end(); } ); batch.test('should give error if appender type cannot be found', (t) => { t.throws( () => log4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }), '- appender "thing" is not valid (type "cheese" could not be found)' ); t.end(); }); batch.test('should create appender instances', (t) => { const thing = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cheese: testAppender('cheesy', thing), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.equal(thing.type, 'cheese'); t.end(); }); batch.test( 'should use provided appender instance if instance provided', (t) => { const thing = {}; const cheese = testAppender('cheesy', thing); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: cheese } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.same(thing.type, cheese); t.end(); } ); batch.test('should not throw error if configure object is freezed', (t) => { const testFile = 'test/tap/freeze-date-file-test'; t.teardown(async () => { await removeFiles(testFile); }); t.doesNotThrow(() => log4js.configure( deepFreeze({ appenders: { dateFile: { type: 'dateFile', filename: testFile, alwaysIncludePattern: false, }, }, categories: { default: { appenders: ['dateFile'], level: log4js.levels.ERROR }, }, }) ) ); log4js.shutdown(() => { t.end(); }); }); batch.test('should load appenders from core first', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { './cheese': testAppender('correct', result), cheese: testAppender('wrong', result), }, 
ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); }); batch.test( 'should load appenders relative to main file if not in core, or node_modules', (t) => { const result = {}; const mainPath = path.dirname(require.main.filename); const sandboxConfig = { ignoreMissing: true, requires: {}, }; sandboxConfig.requires[`${mainPath}/cheese`] = testAppender( 'correct', result ); // add this one, because when we're running coverage the main path is a bit different sandboxConfig.requires[ `${path.join(mainPath, '../../node_modules/nyc/bin/cheese')}` ] = testAppender('correct', result); // in tap v15, the main path is at root of log4js (run `DEBUG=log4js:appenders npm test > /dev/null` to check) sandboxConfig.requires[`${path.join(mainPath, '../../cheese')}`] = testAppender('correct', result); // in node v6, there's an extra layer of node modules for some reason, so add this one to work around it sandboxConfig.requires[ `${path.join( mainPath, '../../node_modules/tap/node_modules/nyc/bin/cheese' )}` ] = testAppender('correct', result); const sandboxedLog4js = sandbox.require( '../../lib/log4js', sandboxConfig ); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test( 'should load appenders relative to process.cwd if not found in core, node_modules', (t) => { const result = {}; const fakeProcess = new Proxy(process, { get(target, key) { if (key === 'cwd') { return () => '/var/lib/cheese'; } return target[key]; }, }); // windows file paths are different to unix, so let's make this work for both. 
const requires = {}; requires[path.join('/var', 'lib', 'cheese', 'cheese')] = testAppender( 'correct', result ); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires, globals: { process: fakeProcess, }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test('should pass config, layout, findAppender to appenders', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires: { cheese: testAppender('cheesy', result), notCheese: testAppender('notCheesy', {}), }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese', foo: 'bar' }, thing2: { type: 'notCheese' }, }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.config.foo, 'bar'); t.type(result.layouts, 'object'); t.type(result.layouts.basicLayout, 'function'); t.type(result.findAppender, 'function'); t.type(result.findAppender('thing2'), 'object'); t.end(); }); batch.test( 'should not give error if level object is used instead of string', (t) => { t.doesNotThrow(() => log4js.configure({ appenders: { thing: { type: 'stdout' } }, categories: { default: { appenders: ['thing'], level: log4js.levels.ERROR }, }, }) ); t.end(); } ); batch.test( 'should not create appender instance if not used in categories', (t) => { const used = {}; const notUsed = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cat: testAppender('meow', used), dog: testAppender('woof', notUsed), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { used: { type: 'cat' }, notUsed: { type: 'dog' } }, categories: { default: { appenders: ['used'], level: 'ERROR' } }, }); t.ok(used.configureCalled); t.notOk(notUsed.configureCalled); t.end(); } ); batch.end(); });
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
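The behaviour described in this PR title touches the public `shutdown()` API: an optional callback is still accepted, but anything that is passed and is not a function now fails fast. A minimal usage sketch of that contract, assuming the published `log4js` package (the exact error type and message are not shown in this excerpt, so treat them as illustrative):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: { out: { type: 'stdout' } },
  categories: { default: { appenders: ['out'], level: 'info' } },
});

log4js.getLogger().info('about to shut down');

// Passing something that is not a function is rejected immediately,
// rather than being silently ignored (error details are illustrative).
try {
  log4js.shutdown('oops, not a function');
} catch (e) {
  console.error('shutdown refused the callback:', e.message);
}

// Omitting the callback, or passing a real function, works as before.
log4js.shutdown(() => {
  console.log('all appenders have been closed');
});
```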
./test/tap/dateFileAppender-test.js
/* eslint max-classes-per-file: ["error", 3] */ const { test } = require('tap'); const path = require('path'); const fs = require('fs'); const EOL = require('os').EOL || '\n'; const format = require('date-format'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; function removeFile(filename) { try { fs.unlinkSync(path.join(__dirname, filename)); } catch (e) { // doesn't matter } } test('../../lib/appenders/dateFile', (batch) => { batch.test('with default settings', (t) => { const testFile = path.join(__dirname, 'date-appender-default.log'); log4js.configure({ appenders: { date: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['date'], level: 'DEBUG' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('This should be in the file.'); t.teardown(() => { removeFile('date-appender-default.log'); }); setTimeout(() => { fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, 'This should be in the file'); t.match( contents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }, osDelay); }); batch.test('configure with dateFileAppender', (t) => { log4js.configure({ appenders: { date: { type: 'dateFile', filename: 'test/tap/date-file-test.log', pattern: '-yyyy-MM-dd', layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['date'], level: 'WARN' } }, }); const logger = log4js.getLogger('tests'); logger.info('this should not be written to the file'); logger.warn('this should be written to the file'); log4js.shutdown(() => { fs.readFile( path.join(__dirname, 'date-file-test.log'), 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.equal( contents.indexOf('this should not be written to the file'), -1 ); t.end(); } ); }); t.teardown(() => { removeFile('date-file-test.log'); }); }); batch.test('configure with options.alwaysIncludePattern', (t) => { const options = { appenders: { date: { category: 'tests', type: 'dateFile', filename: 'test/tap/date-file-test', pattern: 'yyyy-MM-dd.log', alwaysIncludePattern: true, layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['date'], level: 'debug' } }, }; const thisTime = format.asString( options.appenders.date.pattern, new Date() ); const testFile = `date-file-test.${thisTime}`; const existingFile = path.join(__dirname, testFile); fs.writeFileSync(existingFile, `this is existing data${EOL}`, 'utf8'); log4js.configure(options); const logger = log4js.getLogger('tests'); logger.warn('this should be written to the file with the appended date'); t.teardown(() => { removeFile(testFile); }); // wait for filesystem to catch up log4js.shutdown(() => { fs.readFile(existingFile, 'utf8', (err, contents) => { t.match( contents, 'this is existing data', 'should not overwrite the file on open (issue #132)' ); t.match( contents, 'this should be written to the file with the appended date' ); t.end(); }); }); }); batch.test('should flush logs on shutdown', (t) => { const testFile = path.join(__dirname, 'date-appender-flush.log'); log4js.configure({ appenders: { test: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['test'], level: 'trace' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('1'); logger.info('2'); logger.info('3'); t.teardown(() => { removeFile('date-appender-flush.log'); }); log4js.shutdown(() 
=> { fs.readFile(testFile, 'utf8', (err, fileContents) => { // 3 lines of output, plus the trailing newline. t.equal(fileContents.split(EOL).length, 4); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }); }); batch.test('should map maxLogSize to maxSize', (t) => { const fakeStreamroller = {}; class DateRollingFileStream { constructor(filename, pattern, options) { fakeStreamroller.filename = filename; fakeStreamroller.pattern = pattern; fakeStreamroller.options = options; } on() {} // eslint-disable-line class-methods-use-this } fakeStreamroller.DateRollingFileStream = DateRollingFileStream; const dateFileAppenderModule = sandbox.require( '../../lib/appenders/dateFile', { requires: { streamroller: fakeStreamroller }, } ); dateFileAppenderModule.configure( { filename: 'cheese.log', pattern: 'yyyy', maxLogSize: 100, }, { basicLayout: () => {} } ); t.equal(fakeStreamroller.options.maxSize, 100); t.end(); }); batch.test('handling of writer.writable', (t) => { const output = []; let writable = true; const DateRollingFileStream = class { write(loggingEvent) { output.push(loggingEvent); this.written = true; return true; } // eslint-disable-next-line class-methods-use-this on() {} // eslint-disable-next-line class-methods-use-this get writable() { return writable; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { requires: { streamroller: { DateRollingFileStream, }, }, }); const appender = dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout(loggingEvent) { return loggingEvent.data; }, } ); t.test('should log when writer.writable=true', (assert) => { writable = true; appender({ data: 'something to log' }); assert.ok(output.length, 1); assert.match(output[output.length - 1], 'something to log'); assert.end(); }); t.test('should not log when writer.writable=false', (assert) => { writable = false; appender({ data: 'this should not be logged' }); assert.ok(output.length, 1); assert.notMatch(output[output.length - 1], 'this should not be logged'); assert.end(); }); t.end(); }); batch.test('when underlying stream errors', (t) => { let consoleArgs; let errorHandler; const DateRollingFileStream = class { end() { this.ended = true; } on(evt, cb) { if (evt === 'error') { this.errored = true; errorHandler = cb; } } write() { this.written = true; return true; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { globals: { console: { error(...args) { consoleArgs = args; }, }, }, requires: { streamroller: { DateRollingFileStream, }, }, }); dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout() {} } ); errorHandler({ error: 'aargh' }); t.test('should log the error to console.error', (assert) => { assert.ok(consoleArgs); assert.equal( consoleArgs[0], 'log4js.dateFileAppender - Writing to file %s, error happened ' ); assert.equal(consoleArgs[1], 'test1.log'); assert.equal(consoleArgs[2].error, 'aargh'); assert.end(); }); t.end(); }); batch.end(); });
/* eslint max-classes-per-file: ["error", 3] */ const { test } = require('tap'); const path = require('path'); const fs = require('fs'); const EOL = require('os').EOL || '\n'; const format = require('date-format'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; function removeFile(filename) { try { fs.unlinkSync(path.join(__dirname, filename)); } catch (e) { // doesn't matter } } test('../../lib/appenders/dateFile', (batch) => { batch.test('with default settings', (t) => { const testFile = path.join(__dirname, 'date-appender-default.log'); log4js.configure({ appenders: { date: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['date'], level: 'DEBUG' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('This should be in the file.'); t.teardown(() => { removeFile('date-appender-default.log'); }); setTimeout(() => { fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, 'This should be in the file'); t.match( contents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }, osDelay); }); batch.test('configure with dateFileAppender', (t) => { log4js.configure({ appenders: { date: { type: 'dateFile', filename: 'test/tap/date-file-test.log', pattern: '-yyyy-MM-dd', layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['date'], level: 'WARN' } }, }); const logger = log4js.getLogger('tests'); logger.info('this should not be written to the file'); logger.warn('this should be written to the file'); log4js.shutdown(() => { fs.readFile( path.join(__dirname, 'date-file-test.log'), 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.equal( contents.indexOf('this should not be written to the file'), -1 ); t.end(); } ); }); t.teardown(() => { removeFile('date-file-test.log'); }); }); batch.test('configure with options.alwaysIncludePattern', (t) => { const options = { appenders: { date: { category: 'tests', type: 'dateFile', filename: 'test/tap/date-file-test', pattern: 'yyyy-MM-dd.log', alwaysIncludePattern: true, layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['date'], level: 'debug' } }, }; const thisTime = format.asString( options.appenders.date.pattern, new Date() ); const testFile = `date-file-test.${thisTime}`; const existingFile = path.join(__dirname, testFile); fs.writeFileSync(existingFile, `this is existing data${EOL}`, 'utf8'); log4js.configure(options); const logger = log4js.getLogger('tests'); logger.warn('this should be written to the file with the appended date'); t.teardown(() => { removeFile(testFile); }); // wait for filesystem to catch up log4js.shutdown(() => { fs.readFile(existingFile, 'utf8', (err, contents) => { t.match( contents, 'this is existing data', 'should not overwrite the file on open (issue #132)' ); t.match( contents, 'this should be written to the file with the appended date' ); t.end(); }); }); }); batch.test('should flush logs on shutdown', (t) => { const testFile = path.join(__dirname, 'date-appender-flush.log'); log4js.configure({ appenders: { test: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['test'], level: 'trace' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('1'); logger.info('2'); logger.info('3'); t.teardown(() => { removeFile('date-appender-flush.log'); }); log4js.shutdown(() 
=> { fs.readFile(testFile, 'utf8', (err, fileContents) => { // 3 lines of output, plus the trailing newline. t.equal(fileContents.split(EOL).length, 4); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }); }); batch.test('should map maxLogSize to maxSize', (t) => { const fakeStreamroller = {}; class DateRollingFileStream { constructor(filename, pattern, options) { fakeStreamroller.filename = filename; fakeStreamroller.pattern = pattern; fakeStreamroller.options = options; } on() {} // eslint-disable-line class-methods-use-this } fakeStreamroller.DateRollingFileStream = DateRollingFileStream; const dateFileAppenderModule = sandbox.require( '../../lib/appenders/dateFile', { requires: { streamroller: fakeStreamroller }, } ); dateFileAppenderModule.configure( { filename: 'cheese.log', pattern: 'yyyy', maxLogSize: 100, }, { basicLayout: () => {} } ); t.equal(fakeStreamroller.options.maxSize, 100); t.end(); }); batch.test('handling of writer.writable', (t) => { const output = []; let writable = true; const DateRollingFileStream = class { write(loggingEvent) { output.push(loggingEvent); this.written = true; return true; } // eslint-disable-next-line class-methods-use-this on() {} // eslint-disable-next-line class-methods-use-this get writable() { return writable; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { requires: { streamroller: { DateRollingFileStream, }, }, }); const appender = dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout(loggingEvent) { return loggingEvent.data; }, } ); t.test('should log when writer.writable=true', (assert) => { writable = true; appender({ data: 'something to log' }); assert.ok(output.length, 1); assert.match(output[output.length - 1], 'something to log'); assert.end(); }); t.test('should not log when writer.writable=false', (assert) => { writable = false; appender({ data: 'this should not be logged' }); assert.ok(output.length, 1); assert.notMatch(output[output.length - 1], 'this should not be logged'); assert.end(); }); t.end(); }); batch.test('when underlying stream errors', (t) => { let consoleArgs; let errorHandler; const DateRollingFileStream = class { end() { this.ended = true; } on(evt, cb) { if (evt === 'error') { this.errored = true; errorHandler = cb; } } write() { this.written = true; return true; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { globals: { console: { error(...args) { consoleArgs = args; }, }, }, requires: { streamroller: { DateRollingFileStream, }, }, }); dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout() {} } ); errorHandler({ error: 'aargh' }); t.test('should log the error to console.error', (assert) => { assert.ok(consoleArgs); assert.equal( consoleArgs[0], 'log4js.dateFileAppender - Writing to file %s, error happened ' ); assert.equal(consoleArgs[1], 'test1.log'); assert.equal(consoleArgs[2].error, 'aargh'); assert.end(); }); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./test/tap/no-cluster-test.js
const { test } = require('tap');
const proxyquire = require('proxyquire');

test('clustering is disabled if cluster is not present', (t) => {
  const log4js = proxyquire('../../lib/log4js', { cluster: null });
  const recorder = require('../../lib/appenders/recording');

  log4js.configure({
    appenders: { vcr: { type: 'recording' } },
    categories: { default: { appenders: ['vcr'], level: 'debug' } },
  });

  log4js.getLogger().info('it should still work');
  const events = recorder.replay();
  t.equal(events[0].data[0], 'it should still work');
  t.end();
});
const { test } = require('tap');
const proxyquire = require('proxyquire');

test('clustering is disabled if cluster is not present', (t) => {
  const log4js = proxyquire('../../lib/log4js', { cluster: null });
  const recorder = require('../../lib/appenders/recording');

  log4js.configure({
    appenders: { vcr: { type: 'recording' } },
    categories: { default: { appenders: ['vcr'], level: 'debug' } },
  });

  log4js.getLogger().info('it should still work');
  const events = recorder.replay();
  t.equal(events[0].data[0], 'it should still work');
  t.end();
});
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/appenders/categoryFilter.js
const debug = require('debug')('log4js:categoryFilter');

function categoryFilter(excludes, appender) {
  if (typeof excludes === 'string') excludes = [excludes];
  return (logEvent) => {
    debug(`Checking ${logEvent.categoryName} against ${excludes}`);
    if (excludes.indexOf(logEvent.categoryName) === -1) {
      debug('Not excluded, sending to appender');
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender) {
  const appender = findAppender(config.appender);
  return categoryFilter(config.exclude, appender);
}

module.exports.configure = configure;
const debug = require('debug')('log4js:categoryFilter');

function categoryFilter(excludes, appender) {
  if (typeof excludes === 'string') excludes = [excludes];
  return (logEvent) => {
    debug(`Checking ${logEvent.categoryName} against ${excludes}`);
    if (excludes.indexOf(logEvent.categoryName) === -1) {
      debug('Not excluded, sending to appender');
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender) {
  const appender = findAppender(config.appender);
  return categoryFilter(config.exclude, appender);
}

module.exports.configure = configure;
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
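The `categoryFilter` appender shown above drops events whose category appears in `exclude` (a string or an array of category names) and forwards everything else to the appender named in `appender`. A configuration sketch built from that behaviour; the type name `categoryFilter` is assumed to match the core file name:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    everything: { type: 'file', filename: 'all-the-logs.log' },
    // Drop events from the "noisy" category, forward the rest.
    filtered: {
      type: 'categoryFilter',
      exclude: 'noisy',
      appender: 'everything',
    },
  },
  categories: {
    default: { appenders: ['filtered'], level: 'debug' },
  },
});

log4js.getLogger('app').info('this reaches all-the-logs.log');
log4js.getLogger('noisy').info('this is filtered out');
```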
./examples/memory-test.js
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { logs: { type: 'file', filename: 'memory-test.log', }, console: { type: 'stdout', }, file: { type: 'file', filename: 'memory-usage.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['console'], level: 'info' }, 'memory-test': { appenders: ['logs'], level: 'info' }, 'memory-usage': { appenders: ['console', 'file'], level: 'info' }, }, }); const logger = log4js.getLogger('memory-test'); const usage = log4js.getLogger('memory-usage'); for (let i = 0; i < 1000000; i += 1) { if (i % 5000 === 0) { usage.info('%d %d', i, process.memoryUsage().rss); } logger.info('Doing something.'); } log4js.shutdown(() => {});
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { logs: { type: 'file', filename: 'memory-test.log', }, console: { type: 'stdout', }, file: { type: 'file', filename: 'memory-usage.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['console'], level: 'info' }, 'memory-test': { appenders: ['logs'], level: 'info' }, 'memory-usage': { appenders: ['console', 'file'], level: 'info' }, }, }); const logger = log4js.getLogger('memory-test'); const usage = log4js.getLogger('memory-usage'); for (let i = 0; i < 1000000; i += 1) { if (i % 5000 === 0) { usage.info('%d %d', i, process.memoryUsage().rss); } logger.info('Doing something.'); } log4js.shutdown(() => {});
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./examples/pm2.json
{
  "apps": [
    {
      "name": "testing",
      "script": "pm2.js",
      "instances": 0,
      "instance_var": "INSTANCE_ID",
      "exec_mode": "cluster"
    }
  ]
}
{
  "apps": [
    {
      "name": "testing",
      "script": "pm2.js",
      "instances": 0,
      "instance_var": "INSTANCE_ID",
      "exec_mode": "cluster"
    }
  ]
}
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./docs/stdout.md
# Standard Output Appender

This appender writes all log events to the standard output stream. It is the default appender for log4js.

# Configuration

- `type` - `stdout`
- `layout` - `object` (optional, defaults to colouredLayout) - see [layouts](layouts.md)

# Example

```javascript
log4js.configure({
  appenders: { out: { type: "stdout" } },
  categories: { default: { appenders: ["out"], level: "info" } },
});
```
# Standard Output Appender

This appender writes all log events to the standard output stream. It is the default appender for log4js.

# Configuration

- `type` - `stdout`
- `layout` - `object` (optional, defaults to colouredLayout) - see [layouts](layouts.md)

# Example

```javascript
log4js.configure({
  appenders: { out: { type: "stdout" } },
  categories: { default: { appenders: ["out"], level: "info" } },
});
```
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./docs/console.md
# Console Appender

This appender uses node's console object to write log events. It can also be used in the browser, if you're using browserify or something similar. Be aware that writing a high volume of output to the console can make your application use a lot of memory. If you experience this problem, try switching to the [stdout](stdout.md) appender.

# Configuration

- `type` - `console`
- `layout` - `object` (optional, defaults to colouredLayout) - see [layouts](layouts.md)

Note that all log events are output using `console.log` regardless of the event's level (so `ERROR` events will not be logged using `console.error`).

# Example

```javascript
log4js.configure({
  appenders: { console: { type: "console" } },
  categories: { default: { appenders: ["console"], level: "info" } },
});
```
# Console Appender

This appender uses node's console object to write log events. It can also be used in the browser, if you're using browserify or something similar. Be aware that writing a high volume of output to the console can make your application use a lot of memory. If you experience this problem, try switching to the [stdout](stdout.md) appender.

# Configuration

- `type` - `console`
- `layout` - `object` (optional, defaults to colouredLayout) - see [layouts](layouts.md)

Note that all log events are output using `console.log` regardless of the event's level (so `ERROR` events will not be logged using `console.error`).

# Example

```javascript
log4js.configure({
  appenders: { console: { type: "console" } },
  categories: { default: { appenders: ["console"], level: "info" } },
});
```
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/LoggingEvent.js
/* eslint max-classes-per-file: ["error", 2] */ /* eslint no-underscore-dangle: ["error", { "allow": ["_getLocationKeys"] }] */ const flatted = require('flatted'); const levels = require('./levels'); class SerDe { constructor() { const deserialise = { __LOG4JS_undefined__: undefined, __LOG4JS_NaN__: Number('abc'), __LOG4JS_Infinity__: 1 / 0, '__LOG4JS_-Infinity__': -1 / 0, }; this.deMap = deserialise; this.serMap = {}; Object.keys(this.deMap).forEach((key) => { const value = this.deMap[key]; this.serMap[value] = key; }); } canSerialise(key) { if (typeof key === 'string') return false; return key in this.serMap; } serialise(key) { if (this.canSerialise(key)) return this.serMap[key]; return key; } canDeserialise(key) { return key in this.deMap; } deserialise(key) { if (this.canDeserialise(key)) return this.deMap[key]; return key; } } const serde = new SerDe(); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (typeof location !== 'undefined') { if (!location || typeof location !== 'object' || Array.isArray(location)) throw new TypeError( 'Invalid location type passed to LoggingEvent constructor' ); this.constructor._getLocationKeys().forEach((key) => { if (typeof location[key] !== 'undefined') this[key] = location[key]; }); } } /** @private */ static _getLocationKeys() { return [ 'fileName', 'lineNumber', 'columnNumber', 'callStack', 'className', 'functionName', 'functionAlias', 'callerName', ]; } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: Number('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. 
return serde.serialise(value); }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return serde.deserialise(value); }); this._getLocationKeys().forEach((key) => { if (typeof rehydratedEvent[key] !== 'undefined') { if (!rehydratedEvent.location) rehydratedEvent.location = {}; rehydratedEvent.location[key] = rehydratedEvent[key]; } }); event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
/* eslint max-classes-per-file: ["error", 2] */ /* eslint no-underscore-dangle: ["error", { "allow": ["_getLocationKeys"] }] */ const flatted = require('flatted'); const levels = require('./levels'); class SerDe { constructor() { const deserialise = { __LOG4JS_undefined__: undefined, __LOG4JS_NaN__: Number('abc'), __LOG4JS_Infinity__: 1 / 0, '__LOG4JS_-Infinity__': -1 / 0, }; this.deMap = deserialise; this.serMap = {}; Object.keys(this.deMap).forEach((key) => { const value = this.deMap[key]; this.serMap[value] = key; }); } canSerialise(key) { if (typeof key === 'string') return false; return key in this.serMap; } serialise(key) { if (this.canSerialise(key)) return this.serMap[key]; return key; } canDeserialise(key) { return key in this.deMap; } deserialise(key) { if (this.canDeserialise(key)) return this.deMap[key]; return key; } } const serde = new SerDe(); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (typeof location !== 'undefined') { if (!location || typeof location !== 'object' || Array.isArray(location)) throw new TypeError( 'Invalid location type passed to LoggingEvent constructor' ); this.constructor._getLocationKeys().forEach((key) => { if (typeof location[key] !== 'undefined') this[key] = location[key]; }); } } /** @private */ static _getLocationKeys() { return [ 'fileName', 'lineNumber', 'columnNumber', 'callStack', 'className', 'functionName', 'functionAlias', 'callerName', ]; } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: Number('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. 
return serde.serialise(value); }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return serde.deserialise(value); }); this._getLocationKeys().forEach((key) => { if (typeof rehydratedEvent[key] !== 'undefined') { if (!rehydratedEvent.location) rehydratedEvent.location = {}; rehydratedEvent.location[key] = rehydratedEvent[key]; } }); event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
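`LoggingEvent.serialise()` above uses `flatted` plus a small `SerDe` map so that `Error` instances and values like `NaN`, `Infinity` and `undefined` survive a trip that plain `JSON.stringify` would mangle. A round-trip sketch; the require paths are illustrative and assume the module layout used by the tests in this excerpt:

```javascript
const LoggingEvent = require('../../lib/LoggingEvent');
const levels = require('../../lib/levels');

const original = new LoggingEvent(
  'cheese',
  levels.ERROR,
  ['something failed', new Error('oops'), NaN],
  { user: 'alice' }
);

// serialise() produces a string that can be sent over a socket or IPC channel.
const wire = original.serialise();

// deserialise() rebuilds the event, including the Error and the NaN value.
const copy = LoggingEvent.deserialise(wire);
console.log(copy.categoryName); // 'cheese'
console.log(copy.data[1] instanceof Error); // true
console.log(Number.isNaN(copy.data[2])); // true
console.log(copy.context.user); // 'alice'
```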
./examples/smtp-appender.js
// Note that smtp appender needs nodemailer to work. // If you haven't got nodemailer installed, you'll get cryptic // "cannot find module" errors when using the smtp appender const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', }, mail: { type: '@log4js-node/smtp', recipients: 'logfilerecipient@logging.com', sendInterval: 5, transport: 'SMTP', SMTP: { host: 'smtp.gmail.com', secureConnection: true, port: 465, auth: { user: 'someone@gmail', pass: '********************', }, debug: true, }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, mailer: { appenders: ['mail'], level: 'info' }, }, }); const log = log4js.getLogger('test'); const logmailer = log4js.getLogger('mailer'); function doTheLogging(x) { log.info('Logging something %d', x); logmailer.info('Logging something %d', x); } for (let i = 0; i < 500; i += 1) { doTheLogging(i); }
// Note that smtp appender needs nodemailer to work. // If you haven't got nodemailer installed, you'll get cryptic // "cannot find module" errors when using the smtp appender const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', }, mail: { type: '@log4js-node/smtp', recipients: 'logfilerecipient@logging.com', sendInterval: 5, transport: 'SMTP', SMTP: { host: 'smtp.gmail.com', secureConnection: true, port: 465, auth: { user: 'someone@gmail', pass: '********************', }, debug: true, }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, mailer: { appenders: ['mail'], level: 'info' }, }, }); const log = log4js.getLogger('test'); const logmailer = log4js.getLogger('mailer'); function doTheLogging(x) { log.info('Logging something %d', x); logmailer.info('Logging something %d', x); } for (let i = 0; i < 500; i += 1) { doTheLogging(i); }
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./test/tap/appender-dependencies-test.js
const { test } = require('tap'); const categories = { default: { appenders: ['filtered'], level: 'debug' }, }; let log4js; let recording; test('log4js appender dependencies', (batch) => { batch.beforeEach((done) => { log4js = require('../../lib/log4js'); recording = require('../../lib/appenders/recording'); if (typeof done === 'function') { done(); } }); batch.afterEach((done) => { recording.erase(); if (typeof done === 'function') { done(); } }); batch.test('in order', (t) => { const config = { categories, appenders: { recorder: { type: 'recording' }, filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, }, }; t.test('should resolve if defined in dependency order', (assert) => { assert.doesNotThrow(() => { log4js.configure(config); }, 'this should not trigger an error'); assert.end(); }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.error('this should, though'); const logEvents = recording.replay(); t.test('should process log events normally', (assert) => { assert.equal(logEvents.length, 1); assert.equal(logEvents[0].data[0], 'this should, though'); assert.end(); }); t.end(); }); batch.test('not in order', (t) => { const config = { categories, appenders: { filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, recorder: { type: 'recording' }, }, }; t.test('should resolve if defined out of dependency order', (assert) => { assert.doesNotThrow(() => { log4js.configure(config); }, 'this should not trigger an error'); assert.end(); }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.error('this should, though'); const logEvents = recording.replay(); t.test('should process log events normally', (assert) => { assert.equal(logEvents.length, 1); assert.equal(logEvents[0].data[0], 'this should, though'); assert.end(); }); t.end(); }); batch.test('with dependency loop', (t) => { const config = { categories, appenders: { filtered: { type: 'logLevelFilter', appender: 'filtered2', level: 'ERROR', }, filtered2: { type: 'logLevelFilter', appender: 'filtered', level: 'ERROR', }, recorder: { type: 'recording' }, }, }; t.test( 'should throw an error if if a dependency loop is found', (assert) => { assert.throws(() => { log4js.configure(config); }, 'Dependency loop detected for appender filtered.'); assert.end(); } ); t.end(); }); batch.end(); });
const { test } = require('tap'); const categories = { default: { appenders: ['filtered'], level: 'debug' }, }; let log4js; let recording; test('log4js appender dependencies', (batch) => { batch.beforeEach((done) => { log4js = require('../../lib/log4js'); recording = require('../../lib/appenders/recording'); if (typeof done === 'function') { done(); } }); batch.afterEach((done) => { recording.erase(); if (typeof done === 'function') { done(); } }); batch.test('in order', (t) => { const config = { categories, appenders: { recorder: { type: 'recording' }, filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, }, }; t.test('should resolve if defined in dependency order', (assert) => { assert.doesNotThrow(() => { log4js.configure(config); }, 'this should not trigger an error'); assert.end(); }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.error('this should, though'); const logEvents = recording.replay(); t.test('should process log events normally', (assert) => { assert.equal(logEvents.length, 1); assert.equal(logEvents[0].data[0], 'this should, though'); assert.end(); }); t.end(); }); batch.test('not in order', (t) => { const config = { categories, appenders: { filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, recorder: { type: 'recording' }, }, }; t.test('should resolve if defined out of dependency order', (assert) => { assert.doesNotThrow(() => { log4js.configure(config); }, 'this should not trigger an error'); assert.end(); }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.error('this should, though'); const logEvents = recording.replay(); t.test('should process log events normally', (assert) => { assert.equal(logEvents.length, 1); assert.equal(logEvents[0].data[0], 'this should, though'); assert.end(); }); t.end(); }); batch.test('with dependency loop', (t) => { const config = { categories, appenders: { filtered: { type: 'logLevelFilter', appender: 'filtered2', level: 'ERROR', }, filtered2: { type: 'logLevelFilter', appender: 'filtered', level: 'ERROR', }, recorder: { type: 'recording' }, }, }; t.test( 'should throw an error if if a dependency loop is found', (assert) => { assert.throws(() => { log4js.configure(config); }, 'Dependency loop detected for appender filtered.'); assert.end(); } ); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./SECURITY.md
# Security Policy

## Supported Versions

We're aiming to only support the latest major version of log4js. Older than that is usually _very_ old.

| Version | Supported          |
| ------- | ------------------ |
| 6.x     | :white_check_mark: |
| < 6.0   | :x:                |

## Reporting a Vulnerability

Report vulnerabilities via email to:

- Gareth Jones <gareth.nomiddlename@gmail.com>
- Lam Wei Li <lam_wei_li@hotmail.com>

Please put "[log4js:security]" in the subject line. We will aim to respond within a day or two.
# Security Policy

## Supported Versions

We're aiming to only support the latest major version of log4js. Older than that is usually _very_ old.

| Version | Supported          |
| ------- | ------------------ |
| 6.x     | :white_check_mark: |
| < 6.0   | :x:                |

## Reporting a Vulnerability

Report vulnerabilities via email to:

- Gareth Jones <gareth.nomiddlename@gmail.com>
- Lam Wei Li <lam_wei_li@hotmail.com>

Please put "[log4js:security]" in the subject line. We will aim to respond within a day or two.
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/appenders/logLevelFilter.js
function logLevelFilter(minLevelString, maxLevelString, appender, levels) {
  const minLevel = levels.getLevel(minLevelString);
  const maxLevel = levels.getLevel(maxLevelString, levels.FATAL);
  return (logEvent) => {
    const eventLevel = logEvent.level;
    if (
      minLevel.isLessThanOrEqualTo(eventLevel) &&
      maxLevel.isGreaterThanOrEqualTo(eventLevel)
    ) {
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender, levels) {
  const appender = findAppender(config.appender);
  return logLevelFilter(config.level, config.maxLevel, appender, levels);
}

module.exports.configure = configure;
function logLevelFilter(minLevelString, maxLevelString, appender, levels) {
  const minLevel = levels.getLevel(minLevelString);
  const maxLevel = levels.getLevel(maxLevelString, levels.FATAL);
  return (logEvent) => {
    const eventLevel = logEvent.level;
    if (
      minLevel.isLessThanOrEqualTo(eventLevel) &&
      maxLevel.isGreaterThanOrEqualTo(eventLevel)
    ) {
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender, levels) {
  const appender = findAppender(config.appender);
  return logLevelFilter(config.level, config.maxLevel, appender, levels);
}

module.exports.configure = configure;
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
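`logLevelFilter` above only forwards events whose level sits between `level` and the optional `maxLevel` (which defaults to `FATAL`). The type name and option names appear verbatim in the appender-dependencies test earlier in this excerpt, so a configuration sketch looks like:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    everything: { type: 'file', filename: 'all.log' },
    emergencies: { type: 'file', filename: 'panic.log' },
    // Only ERROR and above reach the "emergencies" file.
    'just-errors': {
      type: 'logLevelFilter',
      appender: 'emergencies',
      level: 'ERROR',
    },
  },
  categories: {
    default: { appenders: ['just-errors', 'everything'], level: 'debug' },
  },
});

const logger = log4js.getLogger();
logger.debug('written to all.log only');
logger.error('written to both files');
```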
./lib/appenders/tcp-server.js
const debug = require('debug')('log4js:tcp-server'); const net = require('net'); const clustering = require('../clustering'); const LoggingEvent = require('../LoggingEvent'); const DELIMITER = '__LOG4JS__'; exports.configure = (config) => { debug('configure called with ', config); const server = net.createServer((socket) => { let dataSoFar = ''; const send = (data) => { if (data) { dataSoFar += data; if (dataSoFar.indexOf(DELIMITER)) { const events = dataSoFar.split(DELIMITER); if (!dataSoFar.endsWith(DELIMITER)) { dataSoFar = events.pop(); } else { dataSoFar = ''; } events .filter((e) => e.length) .forEach((e) => { clustering.send(LoggingEvent.deserialise(e)); }); } else { dataSoFar = ''; } } }; socket.setEncoding('utf8'); socket.on('data', send); socket.on('end', send); }); server.listen(config.port || 5000, config.host || 'localhost', () => { debug(`listening on ${config.host || 'localhost'}:${config.port || 5000}`); server.unref(); }); return { shutdown: (cb) => { debug('shutdown called.'); server.close(cb); }, }; };
const debug = require('debug')('log4js:tcp-server');
const net = require('net');
const clustering = require('../clustering');
const LoggingEvent = require('../LoggingEvent');

const DELIMITER = '__LOG4JS__';

exports.configure = (config) => {
  debug('configure called with ', config);

  const server = net.createServer((socket) => {
    let dataSoFar = '';
    const send = (data) => {
      if (data) {
        dataSoFar += data;
        if (dataSoFar.indexOf(DELIMITER)) {
          const events = dataSoFar.split(DELIMITER);
          if (!dataSoFar.endsWith(DELIMITER)) {
            dataSoFar = events.pop();
          } else {
            dataSoFar = '';
          }
          events
            .filter((e) => e.length)
            .forEach((e) => {
              clustering.send(LoggingEvent.deserialise(e));
            });
        } else {
          dataSoFar = '';
        }
      }
    };

    socket.setEncoding('utf8');
    socket.on('data', send);
    socket.on('end', send);
  });

  server.listen(config.port || 5000, config.host || 'localhost', () => {
    debug(`listening on ${config.host || 'localhost'}:${config.port || 5000}`);
    server.unref();
  });

  return {
    shutdown: (cb) => {
      debug('shutdown called.');
      server.close(cb);
    },
  };
};
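Based on the `configure` function above, a minimal sketch of enabling the TCP server appender; the `host` and `port` values shown mirror the defaults read from `config`, and the second appender is illustrative:

```javascript
log4js.configure({
  appenders: {
    // receives serialised events from other processes and re-dispatches them locally
    server: { type: "tcp-server", host: "localhost", port: 5000 },
    out: { type: "stdout" },
  },
  categories: { default: { appenders: ["out"], level: "info" } },
});
```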
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/appenders/adapters.js
function maxFileSizeUnitTransform(maxLogSize) {
  if (typeof maxLogSize === 'number' && Number.isInteger(maxLogSize)) {
    return maxLogSize;
  }

  const units = {
    K: 1024,
    M: 1024 * 1024,
    G: 1024 * 1024 * 1024,
  };
  const validUnit = Object.keys(units);
  const unit = maxLogSize.slice(-1).toLocaleUpperCase();
  const value = maxLogSize.slice(0, -1).trim();

  if (validUnit.indexOf(unit) < 0 || !Number.isInteger(Number(value))) {
    throw Error(`maxLogSize: "${maxLogSize}" is invalid`);
  } else {
    return value * units[unit];
  }
}

function adapter(configAdapter, config) {
  const newConfig = Object.assign({}, config); // eslint-disable-line prefer-object-spread
  Object.keys(configAdapter).forEach((key) => {
    if (newConfig[key]) {
      newConfig[key] = configAdapter[key](config[key]);
    }
  });
  return newConfig;
}

function fileAppenderAdapter(config) {
  const configAdapter = {
    maxLogSize: maxFileSizeUnitTransform,
  };
  return adapter(configAdapter, config);
}

const adapters = {
  dateFile: fileAppenderAdapter,
  file: fileAppenderAdapter,
  fileSync: fileAppenderAdapter,
};

module.exports.modifyConfig = (config) =>
  adapters[config.type] ? adapters[config.type](config) : config;
function maxFileSizeUnitTransform(maxLogSize) {
  if (typeof maxLogSize === 'number' && Number.isInteger(maxLogSize)) {
    return maxLogSize;
  }

  const units = {
    K: 1024,
    M: 1024 * 1024,
    G: 1024 * 1024 * 1024,
  };
  const validUnit = Object.keys(units);
  const unit = maxLogSize.slice(-1).toLocaleUpperCase();
  const value = maxLogSize.slice(0, -1).trim();

  if (validUnit.indexOf(unit) < 0 || !Number.isInteger(Number(value))) {
    throw Error(`maxLogSize: "${maxLogSize}" is invalid`);
  } else {
    return value * units[unit];
  }
}

function adapter(configAdapter, config) {
  const newConfig = Object.assign({}, config); // eslint-disable-line prefer-object-spread
  Object.keys(configAdapter).forEach((key) => {
    if (newConfig[key]) {
      newConfig[key] = configAdapter[key](config[key]);
    }
  });
  return newConfig;
}

function fileAppenderAdapter(config) {
  const configAdapter = {
    maxLogSize: maxFileSizeUnitTransform,
  };
  return adapter(configAdapter, config);
}

const adapters = {
  dateFile: fileAppenderAdapter,
  file: fileAppenderAdapter,
  fileSync: fileAppenderAdapter,
};

module.exports.modifyConfig = (config) =>
  adapters[config.type] ? adapters[config.type](config) : config;
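The adapter above lets the file-based appenders (`file`, `fileSync`, `dateFile`) accept `maxLogSize` either as a byte count or with a `K`/`M`/`G` suffix. A hedged sketch of both accepted forms (file names are illustrative):

```javascript
log4js.configure({
  appenders: {
    plainBytes: { type: "file", filename: "app.log", maxLogSize: 10485760 }, // integer bytes
    withUnit: { type: "file", filename: "app2.log", maxLogSize: "10M" },     // K, M or G suffix
  },
  categories: { default: { appenders: ["withUnit"], level: "info" } },
});
```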
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/appenders/noLogFilter.js
const debug = require('debug')('log4js:noLogFilter');

/**
 * The function removes empty or null regexp from the array
 * @param {string[]} regexp
 * @returns {string[]} a filtered string array with not empty or null regexp
 */
function removeNullOrEmptyRegexp(regexp) {
  const filtered = regexp.filter((el) => el != null && el !== '');
  return filtered;
}

/**
 * Returns a function that will exclude the events in case they match
 * with the regular expressions provided
 * @param {(string|string[])} filters contains the regexp that will be used for the evaluation
 * @param {*} appender
 * @returns {function}
 */
function noLogFilter(filters, appender) {
  return (logEvent) => {
    debug(`Checking data: ${logEvent.data} against filters: ${filters}`);
    if (typeof filters === 'string') {
      filters = [filters];
    }
    filters = removeNullOrEmptyRegexp(filters);
    const regex = new RegExp(filters.join('|'), 'i');
    if (
      filters.length === 0 ||
      logEvent.data.findIndex((value) => regex.test(value)) < 0
    ) {
      debug('Not excluded, sending to appender');
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender) {
  const appender = findAppender(config.appender);
  return noLogFilter(config.exclude, appender);
}

module.exports.configure = configure;
const debug = require('debug')('log4js:noLogFilter');

/**
 * The function removes empty or null regexp from the array
 * @param {string[]} regexp
 * @returns {string[]} a filtered string array with not empty or null regexp
 */
function removeNullOrEmptyRegexp(regexp) {
  const filtered = regexp.filter((el) => el != null && el !== '');
  return filtered;
}

/**
 * Returns a function that will exclude the events in case they match
 * with the regular expressions provided
 * @param {(string|string[])} filters contains the regexp that will be used for the evaluation
 * @param {*} appender
 * @returns {function}
 */
function noLogFilter(filters, appender) {
  return (logEvent) => {
    debug(`Checking data: ${logEvent.data} against filters: ${filters}`);
    if (typeof filters === 'string') {
      filters = [filters];
    }
    filters = removeNullOrEmptyRegexp(filters);
    const regex = new RegExp(filters.join('|'), 'i');
    if (
      filters.length === 0 ||
      logEvent.data.findIndex((value) => regex.test(value)) < 0
    ) {
      debug('Not excluded, sending to appender');
      appender(logEvent);
    }
  };
}

function configure(config, layouts, findAppender) {
  const appender = findAppender(config.appender);
  return noLogFilter(config.exclude, appender);
}

module.exports.configure = configure;
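A hedged configuration sketch for the `noLogFilter` wrapper above; events whose data matches any of the `exclude` patterns (matched case-insensitively, per the `'i'` flag) are dropped, and the patterns shown are illustrative:

```javascript
log4js.configure({
  appenders: {
    out: { type: "stdout" },
    filtered: {
      type: "noLogFilter",
      appender: "out",                         // events that pass go here
      exclude: ["password", "^healthcheck"],   // regexp strings; matching events are suppressed
    },
  },
  categories: { default: { appenders: ["filtered"], level: "debug" } },
});
```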
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./lib/categories.js
const debug = require('debug')('log4js:categories'); const configuration = require('./configuration'); const levels = require('./levels'); const appenders = require('./appenders'); const categories = new Map(); /** * Add inherited config to this category. That includes extra appenders from parent, * and level, if none is set on this category. * This is recursive, so each parent also gets loaded with inherited appenders. * Inheritance is blocked if a category has inherit=false * @param {*} config * @param {*} category the child category * @param {string} categoryName dotted path to category * @return {void} */ function inheritFromParent(config, category, categoryName) { if (category.inherit === false) return; const lastDotIndex = categoryName.lastIndexOf('.'); if (lastDotIndex < 0) return; // category is not a child const parentCategoryName = categoryName.slice(0, lastDotIndex); let parentCategory = config.categories[parentCategoryName]; if (!parentCategory) { // parent is missing, so implicitly create it, so that it can inherit from its parents parentCategory = { inherit: true, appenders: [] }; } // make sure parent has had its inheritance taken care of before pulling its properties to this child inheritFromParent(config, parentCategory, parentCategoryName); // if the parent is not in the config (because we just created it above), // and it inherited a valid configuration, add it to config.categories if ( !config.categories[parentCategoryName] && parentCategory.appenders && parentCategory.appenders.length && parentCategory.level ) { config.categories[parentCategoryName] = parentCategory; } category.appenders = category.appenders || []; category.level = category.level || parentCategory.level; // merge in appenders from parent (parent is already holding its inherited appenders) parentCategory.appenders.forEach((ap) => { if (!category.appenders.includes(ap)) { category.appenders.push(ap); } }); category.parent = parentCategory; } /** * Walk all categories in the config, and pull down any configuration from parent to child. * This includes inherited appenders, and level, where level is not set. * Inheritance is skipped where a category has inherit=false. * @param {*} config */ function addCategoryInheritance(config) { if (!config.categories) return; const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; // add inherited appenders and level to this category inheritFromParent(config, category, name); }); } configuration.addPreProcessingListener((config) => addCategoryInheritance(config) ); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.categories)), 'must have a property "categories" of type object.' ); const categoryNames = Object.keys(config.categories); configuration.throwExceptionIf( config, configuration.not(categoryNames.length), 'must define at least one category.' 
); categoryNames.forEach((name) => { const category = config.categories[name]; configuration.throwExceptionIf( config, [ configuration.not(category.appenders), configuration.not(category.level), ], `category "${name}" is not valid (must be an object with properties "appenders" and "level")` ); configuration.throwExceptionIf( config, configuration.not(Array.isArray(category.appenders)), `category "${name}" is not valid (appenders must be an array of appender names)` ); configuration.throwExceptionIf( config, configuration.not(category.appenders.length), `category "${name}" is not valid (appenders must contain at least one appender name)` ); if (Object.prototype.hasOwnProperty.call(category, 'enableCallStack')) { configuration.throwExceptionIf( config, typeof category.enableCallStack !== 'boolean', `category "${name}" is not valid (enableCallStack must be boolean type)` ); } category.appenders.forEach((appender) => { configuration.throwExceptionIf( config, configuration.not(appenders.get(appender)), `category "${name}" is not valid (appender "${appender}" is not defined)` ); }); configuration.throwExceptionIf( config, configuration.not(levels.getLevel(category.level)), `category "${name}" is not valid (level "${category.level}" not recognised;` + ` valid levels are ${levels.levels.join(', ')})` ); }); configuration.throwExceptionIf( config, configuration.not(config.categories.default), 'must define a "default" category.' ); }); const setup = (config) => { categories.clear(); if (!config) { return; } const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; const categoryAppenders = []; category.appenders.forEach((appender) => { categoryAppenders.push(appenders.get(appender)); debug(`Creating category ${name}`); categories.set(name, { appenders: categoryAppenders, level: levels.getLevel(category.level), enableCallStack: category.enableCallStack || false, }); }); }); }; const init = () => { setup(); }; init(); configuration.addListener(setup); const configForCategory = (category) => { debug(`configForCategory: searching for config for ${category}`); if (categories.has(category)) { debug(`configForCategory: ${category} exists in config, returning it`); return categories.get(category); } let sourceCategoryConfig; if (category.indexOf('.') > 0) { debug(`configForCategory: ${category} has hierarchy, cloning from parents`); sourceCategoryConfig = { ...configForCategory(category.slice(0, category.lastIndexOf('.'))), }; } else { if (!categories.has('default')) { setup({ categories: { default: { appenders: ['out'], level: 'OFF' } } }); } debug('configForCategory: cloning default category'); sourceCategoryConfig = { ...categories.get('default') }; } categories.set(category, sourceCategoryConfig); return sourceCategoryConfig; }; const appendersForCategory = (category) => configForCategory(category).appenders; const getLevelForCategory = (category) => configForCategory(category).level; const setLevelForCategory = (category, level) => { configForCategory(category).level = level; }; const getEnableCallStackForCategory = (category) => configForCategory(category).enableCallStack === true; const setEnableCallStackForCategory = (category, useCallStack) => { configForCategory(category).enableCallStack = useCallStack; }; module.exports = categories; module.exports = Object.assign(module.exports, { appendersForCategory, getLevelForCategory, setLevelForCategory, getEnableCallStackForCategory, setEnableCallStackForCategory, init, });
const debug = require('debug')('log4js:categories'); const configuration = require('./configuration'); const levels = require('./levels'); const appenders = require('./appenders'); const categories = new Map(); /** * Add inherited config to this category. That includes extra appenders from parent, * and level, if none is set on this category. * This is recursive, so each parent also gets loaded with inherited appenders. * Inheritance is blocked if a category has inherit=false * @param {*} config * @param {*} category the child category * @param {string} categoryName dotted path to category * @return {void} */ function inheritFromParent(config, category, categoryName) { if (category.inherit === false) return; const lastDotIndex = categoryName.lastIndexOf('.'); if (lastDotIndex < 0) return; // category is not a child const parentCategoryName = categoryName.slice(0, lastDotIndex); let parentCategory = config.categories[parentCategoryName]; if (!parentCategory) { // parent is missing, so implicitly create it, so that it can inherit from its parents parentCategory = { inherit: true, appenders: [] }; } // make sure parent has had its inheritance taken care of before pulling its properties to this child inheritFromParent(config, parentCategory, parentCategoryName); // if the parent is not in the config (because we just created it above), // and it inherited a valid configuration, add it to config.categories if ( !config.categories[parentCategoryName] && parentCategory.appenders && parentCategory.appenders.length && parentCategory.level ) { config.categories[parentCategoryName] = parentCategory; } category.appenders = category.appenders || []; category.level = category.level || parentCategory.level; // merge in appenders from parent (parent is already holding its inherited appenders) parentCategory.appenders.forEach((ap) => { if (!category.appenders.includes(ap)) { category.appenders.push(ap); } }); category.parent = parentCategory; } /** * Walk all categories in the config, and pull down any configuration from parent to child. * This includes inherited appenders, and level, where level is not set. * Inheritance is skipped where a category has inherit=false. * @param {*} config */ function addCategoryInheritance(config) { if (!config.categories) return; const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; // add inherited appenders and level to this category inheritFromParent(config, category, name); }); } configuration.addPreProcessingListener((config) => addCategoryInheritance(config) ); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.categories)), 'must have a property "categories" of type object.' ); const categoryNames = Object.keys(config.categories); configuration.throwExceptionIf( config, configuration.not(categoryNames.length), 'must define at least one category.' 
); categoryNames.forEach((name) => { const category = config.categories[name]; configuration.throwExceptionIf( config, [ configuration.not(category.appenders), configuration.not(category.level), ], `category "${name}" is not valid (must be an object with properties "appenders" and "level")` ); configuration.throwExceptionIf( config, configuration.not(Array.isArray(category.appenders)), `category "${name}" is not valid (appenders must be an array of appender names)` ); configuration.throwExceptionIf( config, configuration.not(category.appenders.length), `category "${name}" is not valid (appenders must contain at least one appender name)` ); if (Object.prototype.hasOwnProperty.call(category, 'enableCallStack')) { configuration.throwExceptionIf( config, typeof category.enableCallStack !== 'boolean', `category "${name}" is not valid (enableCallStack must be boolean type)` ); } category.appenders.forEach((appender) => { configuration.throwExceptionIf( config, configuration.not(appenders.get(appender)), `category "${name}" is not valid (appender "${appender}" is not defined)` ); }); configuration.throwExceptionIf( config, configuration.not(levels.getLevel(category.level)), `category "${name}" is not valid (level "${category.level}" not recognised;` + ` valid levels are ${levels.levels.join(', ')})` ); }); configuration.throwExceptionIf( config, configuration.not(config.categories.default), 'must define a "default" category.' ); }); const setup = (config) => { categories.clear(); if (!config) { return; } const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; const categoryAppenders = []; category.appenders.forEach((appender) => { categoryAppenders.push(appenders.get(appender)); debug(`Creating category ${name}`); categories.set(name, { appenders: categoryAppenders, level: levels.getLevel(category.level), enableCallStack: category.enableCallStack || false, }); }); }); }; const init = () => { setup(); }; init(); configuration.addListener(setup); const configForCategory = (category) => { debug(`configForCategory: searching for config for ${category}`); if (categories.has(category)) { debug(`configForCategory: ${category} exists in config, returning it`); return categories.get(category); } let sourceCategoryConfig; if (category.indexOf('.') > 0) { debug(`configForCategory: ${category} has hierarchy, cloning from parents`); sourceCategoryConfig = { ...configForCategory(category.slice(0, category.lastIndexOf('.'))), }; } else { if (!categories.has('default')) { setup({ categories: { default: { appenders: ['out'], level: 'OFF' } } }); } debug('configForCategory: cloning default category'); sourceCategoryConfig = { ...categories.get('default') }; } categories.set(category, sourceCategoryConfig); return sourceCategoryConfig; }; const appendersForCategory = (category) => configForCategory(category).appenders; const getLevelForCategory = (category) => configForCategory(category).level; const setLevelForCategory = (category, level) => { configForCategory(category).level = level; }; const getEnableCallStackForCategory = (category) => configForCategory(category).enableCallStack === true; const setEnableCallStackForCategory = (category, useCallStack) => { configForCategory(category).enableCallStack = useCallStack; }; module.exports = categories; module.exports = Object.assign(module.exports, { appendersForCategory, getLevelForCategory, setLevelForCategory, getEnableCallStackForCategory, setEnableCallStackForCategory, init, });
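A hedged sketch of the dotted-category inheritance handled by `inheritFromParent` above, assuming the standard `log4js.configure()` entry point; category and appender names are illustrative:

```javascript
log4js.configure({
  appenders: {
    console: { type: "stdout" },
    app: { type: "file", filename: "app.log" },
  },
  categories: {
    default: { appenders: ["console"], level: "info" },
    catA: { appenders: ["app"], level: "debug" },
    // picks up the "app" appender and level "debug" from catA during pre-processing
    "catA.catB": { appenders: ["console"] },
    // inherit: false opts this child out of inheritance entirely
    "catA.catC": { appenders: ["console"], level: "warn", inherit: false },
  },
});
```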
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./examples/hipchat-appender.js
/** * !!! The hipchat-appender requires `hipchat-notifier` from npm, e.g. * - list as a dependency in your application's package.json || * - npm install hipchat-notifier */ const log4js = require('../lib/log4js'); log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' }, }, }); const logger = log4js.getLogger('hipchat'); logger.warn('Test Warn message'); logger.info('Test Info message'); logger.debug('Test Debug Message'); logger.trace('Test Trace Message'); logger.fatal('Test Fatal Message'); logger.error('Test Error Message'); // alternative configuration demonstrating callback + custom layout // ///////////////////////////////////////////////////////////////// // use a custom layout function (in this case, the provided basicLayout) // format: [TIMESTAMP][LEVEL][category] - [message] log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', hipchat_from: 'Mr. Semantics', hipchat_notify: false, hipchat_response_callback: function (err, response, body) { if (err || response.statusCode > 300) { throw new Error('hipchat-notifier failed'); } console.log('mr semantics callback success'); }, layout: { type: 'basic' }, }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' } }, }); logger.info('Test customLayout from Mr. Semantics');
/** * !!! The hipchat-appender requires `hipchat-notifier` from npm, e.g. * - list as a dependency in your application's package.json || * - npm install hipchat-notifier */ const log4js = require('../lib/log4js'); log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' }, }, }); const logger = log4js.getLogger('hipchat'); logger.warn('Test Warn message'); logger.info('Test Info message'); logger.debug('Test Debug Message'); logger.trace('Test Trace Message'); logger.fatal('Test Fatal Message'); logger.error('Test Error Message'); // alternative configuration demonstrating callback + custom layout // ///////////////////////////////////////////////////////////////// // use a custom layout function (in this case, the provided basicLayout) // format: [TIMESTAMP][LEVEL][category] - [message] log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', hipchat_from: 'Mr. Semantics', hipchat_notify: false, hipchat_response_callback: function (err, response, body) { if (err || response.statusCode > 300) { throw new Error('hipchat-notifier failed'); } console.log('mr semantics callback success'); }, layout: { type: 'basic' }, }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' } }, }); logger.info('Test customLayout from Mr. Semantics');
-1
log4js-node/log4js-node
1,334
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately
lamweili
"2022-10-01T18:55:40Z"
"2022-10-01T18:57:33Z"
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
7ca308dfde78ffc3ece68b77e30107590c61dc12
feat(log4js): if cb is passed to `shutdown()`, it must be a function or it will throw error immediately.
./.github/workflows/npm-publish.yml
# This workflow will run tests using node and then publish a package to GitHub Packages when a milestone is closed
# For more information see: https://help.github.com/actions/language-and-framework-guides/publishing-nodejs-packages

name: Node.js Package

on:
  milestone:
    types: [closed]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci
      - run: npm test
      - run: npm run typings

  publish-npm:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
          registry-url: https://registry.npmjs.org/
      - run: npm ci
      - run: |
          git config user.name github-actions
          git config user.email github-actions@github.com
      - run: npm version ${{ github.event.milestone.title }}
      - run: git push && git push --tags
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{secrets.npm_token}}
# This workflow will run tests using node and then publish a package to GitHub Packages when a milestone is closed
# For more information see: https://help.github.com/actions/language-and-framework-guides/publishing-nodejs-packages

name: Node.js Package

on:
  milestone:
    types: [closed]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci
      - run: npm test
      - run: npm run typings

  publish-npm:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
          registry-url: https://registry.npmjs.org/
      - run: npm ci
      - run: |
          git config user.name github-actions
          git config user.email github-actions@github.com
      - run: npm version ${{ github.event.milestone.title }}
      - run: git push && git push --tags
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{secrets.npm_token}}
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./lib/LoggingEvent.js
/* eslint max-classes-per-file: ["error", 2] */ const flatted = require('flatted'); const levels = require('./levels'); class SerDe { constructor() { const deserialise = { __LOG4JS_undefined__: undefined, __LOG4JS_NaN__: Number('abc'), __LOG4JS_Infinity__: 1 / 0, '__LOG4JS_-Infinity__': -1 / 0, }; this.deMap = deserialise; this.serMap = {}; Object.keys(this.deMap).forEach((key) => { const value = this.deMap[key]; this.serMap[value] = key; }); } canSerialise(key) { if (typeof key === 'string') return false; return key in this.serMap; } serialise(key) { if (this.canSerialise(key)) return this.serMap[key]; return key; } canDeserialise(key) { return key in this.deMap; } deserialise(key) { if (this.canDeserialise(key)) return this.deMap[key]; return key; } } const serde = new SerDe(); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (location) { this.fileName = location.fileName; this.lineNumber = location.lineNumber; this.columnNumber = location.columnNumber; this.callStack = location.callStack; this.className = location.className; this.functionName = location.functionName; this.functionAlias = location.functionAlias; this.callerName = location.callerName; } } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: Number('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. 
return serde.serialise(value); }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return serde.deserialise(value); }); if ( rehydratedEvent.fileName || rehydratedEvent.lineNumber || rehydratedEvent.columnNumber || rehydratedEvent.callStack || rehydratedEvent.className || rehydratedEvent.functionName || rehydratedEvent.functionAlias || rehydratedEvent.callerName ) { rehydratedEvent.location = { fileName: rehydratedEvent.fileName, lineNumber: rehydratedEvent.lineNumber, columnNumber: rehydratedEvent.columnNumber, callStack: rehydratedEvent.callStack, className: rehydratedEvent.className, functionName: rehydratedEvent.functionName, functionAlias: rehydratedEvent.functionAlias, callerName: rehydratedEvent.callerName, }; } event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
/* eslint max-classes-per-file: ["error", 2] */ /* eslint no-underscore-dangle: ["error", { "allow": ["_getLocationKeys"] }] */ const flatted = require('flatted'); const levels = require('./levels'); class SerDe { constructor() { const deserialise = { __LOG4JS_undefined__: undefined, __LOG4JS_NaN__: Number('abc'), __LOG4JS_Infinity__: 1 / 0, '__LOG4JS_-Infinity__': -1 / 0, }; this.deMap = deserialise; this.serMap = {}; Object.keys(this.deMap).forEach((key) => { const value = this.deMap[key]; this.serMap[value] = key; }); } canSerialise(key) { if (typeof key === 'string') return false; return key in this.serMap; } serialise(key) { if (this.canSerialise(key)) return this.serMap[key]; return key; } canDeserialise(key) { return key in this.deMap; } deserialise(key) { if (this.canDeserialise(key)) return this.deMap[key]; return key; } } const serde = new SerDe(); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (typeof location !== 'undefined') { if (!location || typeof location !== 'object' || Array.isArray(location)) throw new TypeError( 'Invalid location type passed to LoggingEvent constructor' ); this.constructor._getLocationKeys().forEach((key) => { if (typeof location[key] !== 'undefined') this[key] = location[key]; }); } } /** @private */ static _getLocationKeys() { return [ 'fileName', 'lineNumber', 'columnNumber', 'callStack', 'className', 'functionName', 'functionAlias', 'callerName', ]; } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: Number('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. 
return serde.serialise(value); }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return serde.deserialise(value); }); this._getLocationKeys().forEach((key) => { if (typeof rehydratedEvent[key] !== 'undefined') { if (!rehydratedEvent.location) rehydratedEvent.location = {}; rehydratedEvent.location[key] = rehydratedEvent[key]; } }); event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
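A hedged round-trip sketch of the `serialise`/`deserialise` pair above; the require paths assume running from the repository root (as the tests do), and the values are illustrative:

```javascript
const LoggingEvent = require('./lib/LoggingEvent');
const levels = require('./lib/levels');

const event = new LoggingEvent(
  'cheese',
  levels.INFO,
  ['hello', NaN, undefined], // NaN/undefined survive via the __LOG4JS_*__ placeholders
  { user: 'bob' }
);

const wire = event.serialise(); // flatted string, safe to send over TCP/cluster transports
const copy = LoggingEvent.deserialise(wire);
console.log(copy.data, copy.context.user);
```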
1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/LoggingEvent-test.js
const flatted = require('flatted'); const { test } = require('tap'); const LoggingEvent = require('../../lib/LoggingEvent'); const levels = require('../../lib/levels'); test('LoggingEvent', (batch) => { batch.test('should serialise to flatted', (t) => { const event = new LoggingEvent( 'cheese', levels.DEBUG, [ 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', } ); // set the event date to a known value event.startTime = new Date(Date.UTC(2018, 1, 4, 18, 30, 23, 10)); const rehydratedEvent = flatted.parse(event.serialise()); t.equal(rehydratedEvent.startTime, '2018-02-04T18:30:23.010Z'); t.equal(rehydratedEvent.categoryName, 'cheese'); t.equal(rehydratedEvent.level.levelStr, 'DEBUG'); t.equal(rehydratedEvent.data.length, 9); t.equal(rehydratedEvent.data[0], 'log message'); t.equal(rehydratedEvent.data[1], '__LOG4JS_NaN__'); t.equal(rehydratedEvent.data[2], 'NaN'); t.equal(rehydratedEvent.data[3], '__LOG4JS_Infinity__'); t.equal(rehydratedEvent.data[4], 'Infinity'); t.equal(rehydratedEvent.data[5], '__LOG4JS_-Infinity__'); t.equal(rehydratedEvent.data[6], '-Infinity'); t.equal(rehydratedEvent.data[7], '__LOG4JS_undefined__'); t.equal(rehydratedEvent.data[8], 'undefined'); t.equal(rehydratedEvent.context.user, 'bob'); t.end(); }); batch.test('should deserialise from flatted', (t) => { const dehydratedEvent = flatted.stringify({ startTime: '2018-02-04T10:25:23.010Z', categoryName: 'biscuits', level: { levelStr: 'INFO', }, data: [ 'some log message', { x: 1 }, '__LOG4JS_NaN__', 'NaN', '__LOG4JS_Infinity__', 'Infinity', '__LOG4JS_-Infinity__', '-Infinity', '__LOG4JS_undefined__', 'undefined', ], context: { thing: 'otherThing' }, pid: '1234', functionName: 'bound', fileName: 'domain.js', lineNumber: 421, columnNumber: 15, callStack: 'at bound (domain.js:421:15)\n', }); const event = LoggingEvent.deserialise(dehydratedEvent); t.type(event, LoggingEvent); t.same(event.startTime, new Date(Date.UTC(2018, 1, 4, 10, 25, 23, 10))); t.equal(event.categoryName, 'biscuits'); t.same(event.level, levels.INFO); t.equal(event.data.length, 10); t.equal(event.data[0], 'some log message'); t.equal(event.data[1].x, 1); t.ok(Number.isNaN(event.data[2])); t.equal(event.data[3], 'NaN'); t.equal(event.data[4], 1 / 0); t.equal(event.data[5], 'Infinity'); t.equal(event.data[6], -1 / 0); t.equal(event.data[7], '-Infinity'); t.equal(event.data[8], undefined); t.equal(event.data[9], 'undefined'); t.equal(event.context.thing, 'otherThing'); t.equal(event.pid, '1234'); t.equal(event.functionName, 'bound'); t.equal(event.fileName, 'domain.js'); t.equal(event.lineNumber, 421); t.equal(event.columnNumber, 15); t.equal(event.callStack, 'at bound (domain.js:421:15)\n'); t.end(); }); batch.test('Should correct construct with/without location info', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = ''; const functionName = ''; const functionAlias = ''; 
const callerName = ''; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); const event2 = new LoggingEvent('cheese', levels.DEBUG, ['log message'], { user: 'bob', }); t.equal(event2.fileName, undefined); t.equal(event2.lineNumber, undefined); t.equal(event2.columnNumber, undefined); t.equal(event2.callStack, undefined); t.equal(event2.className, undefined); t.equal(event2.functionName, undefined); t.equal(event2.functionAlias, undefined); t.equal(event2.callerName, undefined); t.end(); }); batch.test('Should contain class, method and alias names', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); t.end(); }); batch.test('Should correctly serialize and deserialize', (t) => { const error = new Error('test'); const location = { fileName: __filename, lineNumber: 123, columnNumber: 52, callStack: error.stack, className: 'Foo', functionName: 'test', functionAlias: 'baz', callerName: 'Foo.test [as baz]', }; const event = new LoggingEvent( 'cheese', levels.DEBUG, [ error, 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', }, location, error ); const event2 = LoggingEvent.deserialise(event.serialise()); t.match(event2, event); t.end(); }); batch.end(); });
const flatted = require('flatted'); const { test } = require('tap'); const LoggingEvent = require('../../lib/LoggingEvent'); const levels = require('../../lib/levels'); test('LoggingEvent', (batch) => { batch.test('should throw error for invalid location', (t) => { t.throws( () => new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], undefined, [] ), 'Invalid location type passed to LoggingEvent constructor' ); t.end(); }); batch.test('should serialise to flatted', (t) => { const event = new LoggingEvent( 'cheese', levels.DEBUG, [ 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', } ); // set the event date to a known value event.startTime = new Date(Date.UTC(2018, 1, 4, 18, 30, 23, 10)); const rehydratedEvent = flatted.parse(event.serialise()); t.equal(rehydratedEvent.startTime, '2018-02-04T18:30:23.010Z'); t.equal(rehydratedEvent.categoryName, 'cheese'); t.equal(rehydratedEvent.level.levelStr, 'DEBUG'); t.equal(rehydratedEvent.data.length, 9); t.equal(rehydratedEvent.data[0], 'log message'); t.equal(rehydratedEvent.data[1], '__LOG4JS_NaN__'); t.equal(rehydratedEvent.data[2], 'NaN'); t.equal(rehydratedEvent.data[3], '__LOG4JS_Infinity__'); t.equal(rehydratedEvent.data[4], 'Infinity'); t.equal(rehydratedEvent.data[5], '__LOG4JS_-Infinity__'); t.equal(rehydratedEvent.data[6], '-Infinity'); t.equal(rehydratedEvent.data[7], '__LOG4JS_undefined__'); t.equal(rehydratedEvent.data[8], 'undefined'); t.equal(rehydratedEvent.context.user, 'bob'); t.end(); }); batch.test('should deserialise from flatted', (t) => { const dehydratedEvent = flatted.stringify({ startTime: '2018-02-04T10:25:23.010Z', categoryName: 'biscuits', level: { levelStr: 'INFO', }, data: [ 'some log message', { x: 1 }, '__LOG4JS_NaN__', 'NaN', '__LOG4JS_Infinity__', 'Infinity', '__LOG4JS_-Infinity__', '-Infinity', '__LOG4JS_undefined__', 'undefined', ], context: { thing: 'otherThing' }, pid: '1234', functionName: 'bound', fileName: 'domain.js', lineNumber: 421, columnNumber: 15, callStack: 'at bound (domain.js:421:15)\n', }); const event = LoggingEvent.deserialise(dehydratedEvent); t.type(event, LoggingEvent); t.same(event.startTime, new Date(Date.UTC(2018, 1, 4, 10, 25, 23, 10))); t.equal(event.categoryName, 'biscuits'); t.same(event.level, levels.INFO); t.equal(event.data.length, 10); t.equal(event.data[0], 'some log message'); t.equal(event.data[1].x, 1); t.ok(Number.isNaN(event.data[2])); t.equal(event.data[3], 'NaN'); t.equal(event.data[4], 1 / 0); t.equal(event.data[5], 'Infinity'); t.equal(event.data[6], -1 / 0); t.equal(event.data[7], '-Infinity'); t.equal(event.data[8], undefined); t.equal(event.data[9], 'undefined'); t.equal(event.context.thing, 'otherThing'); t.equal(event.pid, '1234'); t.equal(event.functionName, 'bound'); t.equal(event.fileName, 'domain.js'); t.equal(event.lineNumber, 421); t.equal(event.columnNumber, 15); t.equal(event.callStack, 'at bound (domain.js:421:15)\n'); t.end(); }); batch.test('Should correct construct with/without location info', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line 
(readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = ''; const functionName = ''; const functionAlias = ''; const callerName = ''; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); const event2 = new LoggingEvent('cheese', levels.DEBUG, ['log message'], { user: 'bob', }); t.equal(event2.fileName, undefined); t.equal(event2.lineNumber, undefined); t.equal(event2.columnNumber, undefined); t.equal(event2.callStack, undefined); t.equal(event2.className, undefined); t.equal(event2.functionName, undefined); t.equal(event2.functionAlias, undefined); t.equal(event2.callerName, undefined); t.end(); }); batch.test('Should contain class, method and alias names', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); t.end(); }); batch.test('Should correctly serialize and deserialize', (t) => { const error = new Error('test'); const location = { fileName: __filename, lineNumber: 123, columnNumber: 52, callStack: error.stack, className: 'Foo', functionName: 'test', functionAlias: 'baz', callerName: 'Foo.test [as baz]', }; const event = new LoggingEvent( 'cheese', levels.DEBUG, [ error, 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', }, location, error ); const event2 = LoggingEvent.deserialise(event.serialise()); t.match(event2, event); t.end(); }); batch.end(); });
1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/logger-test.js
const { test } = require('tap'); const debug = require('debug')('log4js:test.logger'); const sandbox = require('@log4js-node/sandboxed-module'); const callsites = require('callsites'); const levels = require('../../lib/levels'); const categories = require('../../lib/categories'); /** @type {import('../../types/log4js').LoggingEvent[]} */ const events = []; /** @type {string[]} */ const messages = []; /** * @typedef {import('../../types/log4js').Logger} LoggerClass */ /** @type {{new (): LoggerClass}} */ const Logger = sandbox.require('../../lib/logger', { requires: { './levels': levels, './categories': categories, './clustering': { isMaster: () => true, onlyOnMaster: (fn) => fn(), send: (evt) => { debug('fake clustering got event:', evt); events.push(evt); }, }, }, globals: { console: { ...console, error(msg) { messages.push(msg); }, }, }, }); const testConfig = { level: levels.TRACE, }; test('../../lib/logger', (batch) => { batch.beforeEach((done) => { events.length = 0; testConfig.level = levels.TRACE; if (typeof done === 'function') { done(); } }); batch.test('constructor with no parameters', (t) => { t.throws(() => new Logger(), new Error('No category provided.')); t.end(); }); batch.test('constructor with category', (t) => { const logger = new Logger('cheese'); t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.OFF, 'should use OFF log level'); t.end(); }); batch.test('set level should delegate', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.DEBUG, 'should use level'); t.end(); }); batch.test('isLevelEnabled', (t) => { const logger = new Logger('cheese'); const functions = [ 'isTraceEnabled', 'isDebugEnabled', 'isInfoEnabled', 'isWarnEnabled', 'isErrorEnabled', 'isFatalEnabled', ]; t.test( 'should provide a level enabled function for all levels', (subtest) => { subtest.plan(functions.length); functions.forEach((fn) => { subtest.type(logger[fn], 'function'); }); } ); logger.level = 'INFO'; t.notOk(logger.isTraceEnabled()); t.notOk(logger.isDebugEnabled()); t.ok(logger.isInfoEnabled()); t.ok(logger.isWarnEnabled()); t.ok(logger.isErrorEnabled()); t.ok(logger.isFatalEnabled()); t.end(); }); batch.test('should send log events to dispatch function', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; logger.debug('Event 1'); logger.debug('Event 2'); logger.debug('Event 3'); t.equal(events.length, 3); t.equal(events[0].data[0], 'Event 1'); t.equal(events[1].data[0], 'Event 2'); t.equal(events[2].data[0], 'Event 3'); t.end(); }); batch.test('should add context values to every event', (t) => { const logger = new Logger('fromage'); logger.level = 'debug'; logger.debug('Event 1'); logger.addContext('cheese', 'edam'); logger.debug('Event 2'); logger.debug('Event 3'); logger.addContext('biscuits', 'timtam'); logger.debug('Event 4'); logger.removeContext('cheese'); logger.debug('Event 5'); logger.clearContext(); logger.debug('Event 6'); t.equal(events.length, 6); t.same(events[0].context, {}); t.same(events[1].context, { cheese: 'edam' }); t.same(events[2].context, { cheese: 'edam' }); t.same(events[3].context, { cheese: 'edam', biscuits: 'timtam' }); t.same(events[4].context, { biscuits: 'timtam' }); t.same(events[5].context, {}); t.end(); }); batch.test('should not break when log data has no toString', (t) => { const logger = new Logger('thing'); logger.level = 'debug'; logger.info('Just testing ', Object.create(null)); 
t.equal(events.length, 1); t.end(); }); batch.test( 'default should disable useCallStack unless manual enable', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; t.equal(logger.useCallStack, false); logger.debug('test no callStack'); let event = events.shift(); t.notMatch(event, { functionName: String }); t.notMatch(event, { fileName: String }); t.notMatch(event, { lineNumber: Number }); t.notMatch(event, { columnNumber: Number }); t.notMatch(event, { callStack: String }); logger.useCallStack = false; t.equal(logger.useCallStack, false); logger.useCallStack = 0; t.equal(logger.useCallStack, false); logger.useCallStack = ''; t.equal(logger.useCallStack, false); logger.useCallStack = null; t.equal(logger.useCallStack, false); logger.useCallStack = undefined; t.equal(logger.useCallStack, false); logger.useCallStack = 'true'; t.equal(logger.useCallStack, false); logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.debug('test with callStack'); event = events.shift(); t.match(event, { functionName: String, fileName: String, lineNumber: Number, columnNumber: Number, callStack: String, }); t.end(); } ); batch.test('should correctly switch on/off useCallStack', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.info('hello world'); const callsite = callsites()[0]; t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 12); logger.useCallStack = false; logger.info('disabled'); t.equal(logger.useCallStack, false); t.equal(events[1].data[0], 'disabled'); t.equal(events[1].fileName, undefined); t.equal(events[1].lineNumber, undefined); t.equal(events[1].columnNumber, undefined); t.end(); }); batch.test( 'Once switch on/off useCallStack will apply all same category loggers', (t) => { const logger1 = new Logger('stack'); logger1.level = 'debug'; logger1.useCallStack = true; const logger2 = new Logger('stack'); logger2.level = 'debug'; logger1.info('hello world'); const callsite = callsites()[0]; t.equal(logger1.useCallStack, true); t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 15); // col of the '.' in logger1.info(...) logger2.info('hello world'); const callsite2 = callsites()[0]; t.equal(logger2.useCallStack, true); t.equal(events[1].data[0], 'hello world'); t.equal(events[1].fileName, callsite2.getFileName()); t.equal(events[1].lineNumber, callsite2.getLineNumber() - 1); t.equal(events[1].columnNumber, 15); // col of the '.' in logger1.info(...) 
logger1.useCallStack = false; logger2.info('hello world'); t.equal(logger2.useCallStack, false); t.equal(events[2].data[0], 'hello world'); t.equal(events[2].fileName, undefined); t.equal(events[2].lineNumber, undefined); t.equal(events[2].columnNumber, undefined); t.end(); } ); batch.test('parseCallStack function coverage', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; results = logger.parseCallStack(new Error()); t.ok(results); t.equal(messages.length, 0, 'should not have error'); results = logger.parseCallStack(''); t.notOk(results); t.equal(messages.length, 1, 'should have error'); results = logger.parseCallStack(new Error(), 100); t.equal(results, null); t.end(); }); batch.test('parseCallStack names extraction', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; const callStack1 = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack1 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'Foo.bar [as baz]'); const callStack2 = ' at bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack2 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'bar [as baz]'); const callStack3 = ' at bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack3 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'bar'); const callStack4 = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack4 }, 0); t.ok(results); 
t.equal(results.className, ''); t.equal(results.functionName, ''); t.equal(results.functionAlias, ''); t.equal(results.callerName, ''); const callStack5 = ' at Foo.bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack5 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'Foo.bar'); t.end(); }); batch.test('should correctly change the parseCallStack function', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; logger.info('test defaultParseCallStack'); const initialEvent = events.shift(); const parseFunction = function () { return { functionName: 'test function name', fileName: 'test file name', lineNumber: 15, columnNumber: 25, callStack: 'test callstack', }; }; logger.setParseCallStackFunction(parseFunction); t.equal(logger.parseCallStack, parseFunction); logger.info('test parseCallStack'); t.equal(events[0].functionName, 'test function name'); t.equal(events[0].fileName, 'test file name'); t.equal(events[0].lineNumber, 15); t.equal(events[0].columnNumber, 25); t.equal(events[0].callStack, 'test callstack'); events.shift(); logger.setParseCallStackFunction(undefined); logger.info('test restoredDefaultParseCallStack'); t.equal(events[0].functionName, initialEvent.functionName); t.equal(events[0].fileName, initialEvent.fileName); t.equal(events[0].columnNumber, initialEvent.columnNumber); t.throws(() => logger.setParseCallStackFunction('not a function')); t.end(); }); batch.test('should correctly change the stack levels to skip', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal( logger.callStackLinesToSkip, 0, 'initial callStackLinesToSkip changed' ); logger.info('get initial stack'); const initialEvent = events.shift(); const newStackSkip = 1; logger.callStackLinesToSkip = newStackSkip; t.equal(logger.callStackLinesToSkip, newStackSkip); logger.info('test stack skip'); const event = events.shift(); t.not(event.functionName, initialEvent.functionName); t.not(event.fileName, initialEvent.fileName); t.equal( event.callStack, initialEvent.callStack.split('\n').slice(newStackSkip).join('\n') ); t.throws(() => { logger.callStackLinesToSkip = -1; }); t.throws(() => { logger.callStackLinesToSkip = '2'; }); t.end(); }); batch.test('should utilize the first Error data value', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; const error = new Error(); logger.info(error); const event = events.shift(); t.equal(event.error, error); logger.info(error); t.match(event, events.shift()); logger.callStackLinesToSkip = 1; logger.info(error); const event2 = events.shift(); t.equal(event2.callStack, event.callStack.split('\n').slice(1).join('\n')); logger.callStackLinesToSkip = 0; logger.info('hi', error); const event3 = events.shift(); t.equal(event3.callStack, event.callStack); t.equal(event3.error, error); logger.info('hi', error, new Error()); const event4 = events.shift(); t.equal(event4.callStack, 
event.callStack); t.equal(event4.error, error); t.end(); }); batch.test('creating/cloning of category', (t) => { const defaultLogger = new Logger('default'); defaultLogger.level = 'trace'; defaultLogger.useCallStack = true; t.test( 'category should be cloned from parent/default if does not exist', (assert) => { const originalLength = categories.size; const logger = new Logger('cheese1'); assert.equal( categories.size, originalLength + 1, 'category should be cloned' ); assert.equal( logger.level, levels.TRACE, 'should inherit level=TRACE from default-category' ); assert.equal( logger.useCallStack, true, 'should inherit useCallStack=true from default-category' ); assert.end(); } ); t.test( 'changing level should not impact default-category or useCallStack', (assert) => { const logger = new Logger('cheese2'); logger.level = 'debug'; assert.equal( logger.level, levels.DEBUG, 'should be changed to level=DEBUG' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.equal( logger.useCallStack, true, 'should remain as useCallStack=true' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.end(); } ); t.test( 'changing useCallStack should not impact default-category or level', (assert) => { const logger = new Logger('cheese3'); logger.useCallStack = false; assert.equal( logger.useCallStack, false, 'should be changed to useCallStack=false' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.equal( logger.level, levels.TRACE, 'should remain as level=TRACE' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.end(); } ); t.end(); }); batch.end(); });
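The assertions above exercise the `useCallStack` flag and the location fields it adds to each logging event. As a minimal sketch of the same behaviour through the public log4js API (the `%f`/`%l` pattern tokens are an assumption about the pattern layout; the tests themselves only inspect the raw event fields):

```javascript
// Minimal sketch, not part of the test file above: enabling useCallStack
// makes log4js attach fileName, lineNumber, columnNumber and callStack to
// every event, which pattern-layout tokens such as %f and %l can render.
const log4js = require("log4js");

log4js.configure({
  appenders: {
    out: {
      type: "stdout",
      layout: { type: "pattern", pattern: "%d %p %f:%l %m" },
    },
  },
  categories: { default: { appenders: ["out"], level: "debug" } },
});

const logger = log4js.getLogger();
logger.useCallStack = true; // off by default, as the tests above assert
logger.debug("where was I called from?");
```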
const { test } = require('tap'); const debug = require('debug')('log4js:test.logger'); const sandbox = require('@log4js-node/sandboxed-module'); const callsites = require('callsites'); const levels = require('../../lib/levels'); const categories = require('../../lib/categories'); /** @type {import('../../types/log4js').LoggingEvent[]} */ const events = []; /** @type {string[]} */ const messages = []; /** * @typedef {import('../../types/log4js').Logger} LoggerClass */ /** @type {{new (): LoggerClass}} */ const Logger = sandbox.require('../../lib/logger', { requires: { './levels': levels, './categories': categories, './clustering': { isMaster: () => true, onlyOnMaster: (fn) => fn(), send: (evt) => { debug('fake clustering got event:', evt); events.push(evt); }, }, }, globals: { console: { ...console, error(msg) { messages.push(msg); }, }, }, }); const testConfig = { level: levels.TRACE, }; test('../../lib/logger', (batch) => { batch.beforeEach((done) => { events.length = 0; testConfig.level = levels.TRACE; if (typeof done === 'function') { done(); } }); batch.test('constructor with no parameters', (t) => { t.throws(() => new Logger(), new Error('No category provided.')); t.end(); }); batch.test('constructor with category', (t) => { const logger = new Logger('cheese'); t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.OFF, 'should use OFF log level'); t.end(); }); batch.test('set level should delegate', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.DEBUG, 'should use level'); t.end(); }); batch.test('isLevelEnabled', (t) => { const logger = new Logger('cheese'); const functions = [ 'isTraceEnabled', 'isDebugEnabled', 'isInfoEnabled', 'isWarnEnabled', 'isErrorEnabled', 'isFatalEnabled', ]; t.test( 'should provide a level enabled function for all levels', (subtest) => { subtest.plan(functions.length); functions.forEach((fn) => { subtest.type(logger[fn], 'function'); }); } ); logger.level = 'INFO'; t.notOk(logger.isTraceEnabled()); t.notOk(logger.isDebugEnabled()); t.ok(logger.isInfoEnabled()); t.ok(logger.isWarnEnabled()); t.ok(logger.isErrorEnabled()); t.ok(logger.isFatalEnabled()); t.end(); }); batch.test('should send log events to dispatch function', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; logger.debug('Event 1'); logger.debug('Event 2'); logger.debug('Event 3'); t.equal(events.length, 3); t.equal(events[0].data[0], 'Event 1'); t.equal(events[1].data[0], 'Event 2'); t.equal(events[2].data[0], 'Event 3'); t.end(); }); batch.test('should add context values to every event', (t) => { const logger = new Logger('fromage'); logger.level = 'debug'; logger.debug('Event 1'); logger.addContext('cheese', 'edam'); logger.debug('Event 2'); logger.debug('Event 3'); logger.addContext('biscuits', 'timtam'); logger.debug('Event 4'); logger.removeContext('cheese'); logger.debug('Event 5'); logger.clearContext(); logger.debug('Event 6'); t.equal(events.length, 6); t.same(events[0].context, {}); t.same(events[1].context, { cheese: 'edam' }); t.same(events[2].context, { cheese: 'edam' }); t.same(events[3].context, { cheese: 'edam', biscuits: 'timtam' }); t.same(events[4].context, { biscuits: 'timtam' }); t.same(events[5].context, {}); t.end(); }); batch.test('should not break when log data has no toString', (t) => { const logger = new Logger('thing'); logger.level = 'debug'; logger.info('Just testing ', Object.create(null)); 
t.equal(events.length, 1); t.end(); }); batch.test( 'default should disable useCallStack unless manual enable', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; t.equal(logger.useCallStack, false); logger.debug('test no callStack'); let event = events.shift(); t.notMatch(event, { functionName: String }); t.notMatch(event, { fileName: String }); t.notMatch(event, { lineNumber: Number }); t.notMatch(event, { columnNumber: Number }); t.notMatch(event, { callStack: String }); logger.useCallStack = false; t.equal(logger.useCallStack, false); logger.useCallStack = 0; t.equal(logger.useCallStack, false); logger.useCallStack = ''; t.equal(logger.useCallStack, false); logger.useCallStack = null; t.equal(logger.useCallStack, false); logger.useCallStack = undefined; t.equal(logger.useCallStack, false); logger.useCallStack = 'true'; t.equal(logger.useCallStack, false); logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.debug('test with callStack'); event = events.shift(); t.match(event, { functionName: String, fileName: String, lineNumber: Number, columnNumber: Number, callStack: String, }); t.end(); } ); batch.test('should correctly switch on/off useCallStack', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.info('hello world'); const callsite = callsites()[0]; t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 12); logger.useCallStack = false; logger.info('disabled'); t.equal(logger.useCallStack, false); t.equal(events[1].data[0], 'disabled'); t.equal(events[1].fileName, undefined); t.equal(events[1].lineNumber, undefined); t.equal(events[1].columnNumber, undefined); t.end(); }); batch.test( 'Once switch on/off useCallStack will apply all same category loggers', (t) => { const logger1 = new Logger('stack'); logger1.level = 'debug'; logger1.useCallStack = true; const logger2 = new Logger('stack'); logger2.level = 'debug'; logger1.info('hello world'); const callsite = callsites()[0]; t.equal(logger1.useCallStack, true); t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 15); // col of the '.' in logger1.info(...) logger2.info('hello world'); const callsite2 = callsites()[0]; t.equal(logger2.useCallStack, true); t.equal(events[1].data[0], 'hello world'); t.equal(events[1].fileName, callsite2.getFileName()); t.equal(events[1].lineNumber, callsite2.getLineNumber() - 1); t.equal(events[1].columnNumber, 15); // col of the '.' in logger1.info(...) 
logger1.useCallStack = false; logger2.info('hello world'); t.equal(logger2.useCallStack, false); t.equal(events[2].data[0], 'hello world'); t.equal(events[2].fileName, undefined); t.equal(events[2].lineNumber, undefined); t.equal(events[2].columnNumber, undefined); t.end(); } ); batch.test('parseCallStack function coverage', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; results = logger.parseCallStack(new Error()); t.ok(results); t.equal(messages.length, 0, 'should not have error'); results = logger.parseCallStack(''); t.notOk(results); t.equal(messages.length, 1, 'should have error'); results = logger.parseCallStack(new Error(), 100); t.equal(results, null); t.end(); }); batch.test('parseCallStack names extraction', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; const callStack1 = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack1 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'Foo.bar [as baz]'); const callStack2 = ' at bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack2 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'bar [as baz]'); const callStack3 = ' at bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack3 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'bar'); const callStack4 = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack4 }, 0); t.ok(results); 
t.equal(results.className, ''); t.equal(results.functionName, ''); t.equal(results.functionAlias, ''); t.equal(results.callerName, ''); const callStack5 = ' at Foo.bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack5 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'Foo.bar'); t.end(); }); batch.test('should correctly change the parseCallStack function', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; logger.info('test defaultParseCallStack'); const initialEvent = events.shift(); const parseFunction = function () { return { functionName: 'test function name', fileName: 'test file name', lineNumber: 15, columnNumber: 25, callStack: 'test callstack', }; }; logger.setParseCallStackFunction(parseFunction); t.equal(logger.parseCallStack, parseFunction); logger.info('test parseCallStack'); t.equal(events[0].functionName, 'test function name'); t.equal(events[0].fileName, 'test file name'); t.equal(events[0].lineNumber, 15); t.equal(events[0].columnNumber, 25); t.equal(events[0].callStack, 'test callstack'); events.shift(); logger.setParseCallStackFunction(undefined); logger.info('test restoredDefaultParseCallStack'); t.equal(events[0].functionName, initialEvent.functionName); t.equal(events[0].fileName, initialEvent.fileName); t.equal(events[0].columnNumber, initialEvent.columnNumber); t.throws( () => logger.setParseCallStackFunction('not a function'), 'Invalid type passed to setParseCallStackFunction' ); t.end(); }); batch.test('should correctly change the stack levels to skip', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal( logger.callStackLinesToSkip, 0, 'initial callStackLinesToSkip changed' ); logger.info('get initial stack'); const initialEvent = events.shift(); const newStackSkip = 1; logger.callStackLinesToSkip = newStackSkip; t.equal(logger.callStackLinesToSkip, newStackSkip); logger.info('test stack skip'); const event = events.shift(); t.not(event.functionName, initialEvent.functionName); t.not(event.fileName, initialEvent.fileName); t.equal( event.callStack, initialEvent.callStack.split('\n').slice(newStackSkip).join('\n') ); t.throws(() => { logger.callStackLinesToSkip = -1; }); t.throws(() => { logger.callStackLinesToSkip = '2'; }); t.end(); }); batch.test('should utilize the first Error data value', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; const error = new Error(); logger.info(error); const event = events.shift(); t.equal(event.error, error); logger.info(error); t.match(event, events.shift()); logger.callStackLinesToSkip = 1; logger.info(error); const event2 = events.shift(); t.equal(event2.callStack, event.callStack.split('\n').slice(1).join('\n')); logger.callStackLinesToSkip = 0; logger.info('hi', error); const event3 = events.shift(); t.equal(event3.callStack, event.callStack); t.equal(event3.error, error); logger.info('hi', error, new Error()); const event4 = 
events.shift(); t.equal(event4.callStack, event.callStack); t.equal(event4.error, error); t.end(); }); batch.test('creating/cloning of category', (t) => { const defaultLogger = new Logger('default'); defaultLogger.level = 'trace'; defaultLogger.useCallStack = true; t.test( 'category should be cloned from parent/default if does not exist', (assert) => { const originalLength = categories.size; const logger = new Logger('cheese1'); assert.equal( categories.size, originalLength + 1, 'category should be cloned' ); assert.equal( logger.level, levels.TRACE, 'should inherit level=TRACE from default-category' ); assert.equal( logger.useCallStack, true, 'should inherit useCallStack=true from default-category' ); assert.end(); } ); t.test( 'changing level should not impact default-category or useCallStack', (assert) => { const logger = new Logger('cheese2'); logger.level = 'debug'; assert.equal( logger.level, levels.DEBUG, 'should be changed to level=DEBUG' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.equal( logger.useCallStack, true, 'should remain as useCallStack=true' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.end(); } ); t.test( 'changing useCallStack should not impact default-category or level', (assert) => { const logger = new Logger('cheese3'); logger.useCallStack = false; assert.equal( logger.useCallStack, false, 'should be changed to useCallStack=false' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.equal( logger.level, levels.TRACE, 'should remain as level=TRACE' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.end(); } ); t.end(); }); batch.end(); });
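The `parseCallStack` tests above also cover `setParseCallStackFunction`. A short sketch of how an application might swap in its own parser; the field values are placeholders, and per the assertions above only a function or `undefined` is accepted, otherwise the call throws:

```javascript
// Sketch: overriding how the call site is extracted from an Error's stack.
const log4js = require("log4js");

const logger = log4js.getLogger("stack");
logger.level = "debug";
logger.useCallStack = true;

logger.setParseCallStackFunction((error) => ({
  functionName: "customFunction", // placeholder values for illustration
  fileName: "custom-file.js",
  lineNumber: 1,
  columnNumber: 1,
  callStack: error.stack,
}));
logger.debug("uses the custom parser");

logger.setParseCallStackFunction(undefined); // restores the default parser
```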
1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/connect-logger-test.js
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const EE = require('events').EventEmitter; const levels = require('../../lib/levels'); class MockLogger { constructor() { this.level = levels.TRACE; this.messages = []; this.log = function (level, message) { this.messages.push({ level, message }); }; this.isLevelEnabled = function (level) { return level.isGreaterThanOrEqualTo(this.level); }; } } function MockRequest(remoteAddr, method, originalUrl, headers, url, custom) { this.socket = { remoteAddress: remoteAddr }; this.originalUrl = originalUrl; this.url = url; this.method = method; this.httpVersionMajor = '5'; this.httpVersionMinor = '0'; this.headers = headers || {}; if (custom) { for (const key of Object.keys(custom)) { this[key] = custom[key]; } } const self = this; Object.keys(this.headers).forEach((key) => { self.headers[key.toLowerCase()] = self.headers[key]; }); } class MockResponse extends EE { constructor() { super(); this.cachedHeaders = {}; } end() { this.emit('finish'); } setHeader(key, value) { this.cachedHeaders[key.toLowerCase()] = value; } getHeader(key) { return this.cachedHeaders[key.toLowerCase()]; } writeHead(code /* , headers */) { this.statusCode = code; } } function request( cl, method, originalUrl, code, reqHeaders, resHeaders, next, url, custom = undefined ) { const req = new MockRequest( 'my.remote.addr', method, originalUrl, reqHeaders, url, custom ); const res = new MockResponse(); if (next) { next = next.bind(null, req, res, () => {}); } else { next = () => {}; } cl(req, res, next); res.writeHead(code, resHeaders); res.end('chunk', 'encoding'); } test('log4js connect logger', (batch) => { const clm = require('../../lib/connect-logger'); batch.test('getConnectLoggerModule', (t) => { t.type(clm, 'function', 'should return a connect logger factory'); t.test( 'should take a log4js logger and return a "connect logger"', (assert) => { const ml = new MockLogger(); const cl = clm(ml); assert.type(cl, 'function'); assert.end(); } ); t.test('log events', (assert) => { const ml = new MockLogger(); const cl = clm(ml); request(cl, 'GET', 'http://url', 200); const { messages } = ml; assert.type(messages, 'Array'); assert.equal(messages.length, 1); assert.ok(levels.INFO.isEqualTo(messages[0].level)); assert.match(messages[0].message, 'GET'); assert.match(messages[0].message, 'http://url'); assert.match(messages[0].message, 'my.remote.addr'); assert.match(messages[0].message, '200'); assert.end(); }); t.test('log events with level below logging level', (assert) => { const ml = new MockLogger(); ml.level = levels.FATAL; const cl = clm(ml); request(cl, 'GET', 'http://url', 200); assert.type(ml.messages, 'Array'); assert.equal(ml.messages.length, 0); assert.end(); }); t.test('log events with non-default level and custom format', (assert) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.WARN, format: ':method :url' }); request(cl, 'GET', 'http://url', 200); const { messages } = ml; assert.type(messages, Array); assert.equal(messages.length, 1); assert.ok(levels.WARN.isEqualTo(messages[0].level)); assert.equal(messages[0].message, 'GET http://url'); assert.end(); }); t.test('adding multiple loggers should only log once', (assert) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.WARN, format: ':method :url' }); const nextLogger = clm(ml, { level: levels.INFO, format: ':method' }); request(cl, 'GET', 'http://url', 200, null, null, nextLogger); const { messages } = 
ml; assert.type(messages, Array); assert.equal(messages.length, 1); assert.ok(levels.WARN.isEqualTo(messages[0].level)); assert.equal(messages[0].message, 'GET http://url'); assert.end(); }); t.end(); }); batch.test('logger with options as string', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':method :url'); request(cl, 'POST', 'http://meh', 200); const { messages } = ml; t.equal(messages[0].message, 'POST http://meh'); t.end(); }); batch.test('auto log levels', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: 'auto', format: ':method :url' }); request(cl, 'GET', 'http://meh', 200); request(cl, 'GET', 'http://meh', 201); request(cl, 'GET', 'http://meh', 302); request(cl, 'GET', 'http://meh', 404); request(cl, 'GET', 'http://meh', 500); const { messages } = ml; t.test('should use INFO for 2xx', (assert) => { assert.ok(levels.INFO.isEqualTo(messages[0].level)); assert.ok(levels.INFO.isEqualTo(messages[1].level)); assert.end(); }); t.test('should use WARN for 3xx', (assert) => { assert.ok(levels.WARN.isEqualTo(messages[2].level)); assert.end(); }); t.test('should use ERROR for 4xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[3].level)); assert.end(); }); t.test('should use ERROR for 5xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[4].level)); assert.end(); }); t.end(); }); batch.test('logger with status code rules applied', (t) => { const ml = new MockLogger(); ml.level = levels.DEBUG; const clr = [ { codes: [201, 304], level: levels.DEBUG.toString() }, { from: 200, to: 299, level: levels.DEBUG.toString() }, { from: 300, to: 399, level: levels.INFO.toString() }, ]; const cl = clm(ml, { level: 'auto', format: ':method :url', statusRules: clr, }); request(cl, 'GET', 'http://meh', 200); request(cl, 'GET', 'http://meh', 201); request(cl, 'GET', 'http://meh', 302); request(cl, 'GET', 'http://meh', 304); request(cl, 'GET', 'http://meh', 404); request(cl, 'GET', 'http://meh', 500); const { messages } = ml; t.test('should use DEBUG for 2xx', (assert) => { assert.ok(levels.DEBUG.isEqualTo(messages[0].level)); assert.ok(levels.DEBUG.isEqualTo(messages[1].level)); assert.end(); }); t.test('should use WARN for 3xx, DEBUG for 304', (assert) => { assert.ok(levels.INFO.isEqualTo(messages[2].level)); assert.ok(levels.DEBUG.isEqualTo(messages[3].level)); assert.end(); }); t.test('should use ERROR for 4xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[4].level)); assert.end(); }); t.test('should use ERROR for 5xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[5].level)); assert.end(); }); t.end(); }); batch.test('format using a function', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, () => 'I was called'); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages[0].message, 'I was called'); t.end(); }); batch.test('format using a function that also uses tokens', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm( ml, (req, res, tokenReplacer) => `${req.method} ${tokenReplacer(':status')}` ); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages[0].message, 'GET 200'); t.end(); }); batch.test( 'format using a function, but do not log anything if the function returns nothing', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, () => null); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages.length, 0); t.end(); } ); batch.test('format that includes request headers', (t) => { const 
ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':req[Content-Type]'); request(cl, 'GET', 'http://blah', 200, { 'Content-Type': 'application/json', }); t.equal(ml.messages[0].message, 'application/json'); t.end(); }); batch.test('format that includes response headers', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':res[Content-Type]'); request(cl, 'GET', 'http://blah', 200, null, { 'Content-Type': 'application/cheese', }); t.equal(ml.messages[0].message, 'application/cheese'); t.end(); }); batch.test('url token should check originalUrl and url', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':url'); request(cl, 'GET', null, 200, null, null, null, 'http://cheese'); t.equal(ml.messages[0].message, 'http://cheese'); t.end(); }); batch.test('log events with custom token', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: ':method :url :custom_string', tokens: [ { token: ':custom_string', replacement: 'fooBAR', }, ], }); request(cl, 'GET', 'http://url', 200); t.type(ml.messages, 'Array'); t.equal(ml.messages.length, 1); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, 'GET http://url fooBAR'); t.end(); }); batch.test('log events with custom override token', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: ':method :url :date', tokens: [ { token: ':date', replacement: '20150310', }, ], }); request(cl, 'GET', 'http://url', 200); t.type(ml.messages, 'Array'); t.equal(ml.messages.length, 1); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, 'GET http://url 20150310'); t.end(); }); batch.test('log events with custom format', (t) => { const ml = new MockLogger(); const body = { say: 'hi!' }; ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: (req, res, format) => format(`:method :url ${JSON.stringify(req.body)}`), }); request( cl, 'POST', 'http://url', 200, { 'Content-Type': 'application/json' }, null, null, null, { body } ); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, `POST http://url ${JSON.stringify(body)}`); t.end(); }); batch.test( 'handle weird old node versions where socket contains socket', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':remote-addr'); const req = new MockRequest(null, 'GET', 'http://blah'); req.socket = { socket: { remoteAddress: 'this is weird' } }; const res = new MockResponse(); cl(req, res, () => {}); res.writeHead(200, {}); res.end('chunk', 'encoding'); t.equal(ml.messages[0].message, 'this is weird'); t.end(); } ); batch.test( 'handles as soon as any of the events end/finish/error/close triggers (only once)', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':remote-addr'); const req = new MockRequest(null, 'GET', 'http://blah'); req.socket = { socket: { remoteAddress: 'this is weird' } }; const res = new MockResponse(); cl(req, res, () => {}); res.writeHead(200, {}); t.equal(ml.messages.length, 0); res.emit('end'); res.emit('finish'); res.emit('error'); res.emit('close'); t.equal(ml.messages.length, 1); t.equal(ml.messages[0].message, 'this is weird'); t.end(); } ); batch.end(); });
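These connect-logger tests drive the middleware with mock request/response objects. For context, a hedged sketch of the same options wired into a real HTTP framework (Express here is an assumption, not something the tests depend on):

```javascript
// Sketch: log4js.connectLogger as Express middleware, using the same
// options the tests above exercise (auto level selection and status rules).
const log4js = require("log4js");
const express = require("express"); // assumption: express is installed

log4js.configure({
  appenders: { out: { type: "stdout" } },
  categories: { default: { appenders: ["out"], level: "info" } },
});

const app = express();
app.use(
  log4js.connectLogger(log4js.getLogger("http"), {
    level: "auto", // INFO for 2xx, WARN for 3xx, ERROR for 4xx/5xx by default
    format: ":method :url :status",
    statusRules: [{ from: 300, to: 399, level: "info" }],
  })
);
app.get("/", (req, res) => res.send("ok"));
app.listen(3000);
```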
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const EE = require('events').EventEmitter; const levels = require('../../lib/levels'); class MockLogger { constructor() { this.level = levels.TRACE; this.messages = []; this.log = function (level, message) { this.messages.push({ level, message }); }; this.isLevelEnabled = function (level) { return level.isGreaterThanOrEqualTo(this.level); }; } } function MockRequest(remoteAddr, method, originalUrl, headers, url, custom) { this.socket = { remoteAddress: remoteAddr }; this.originalUrl = originalUrl; this.url = url; this.method = method; this.httpVersionMajor = '5'; this.httpVersionMinor = '0'; this.headers = headers || {}; if (custom) { for (const key of Object.keys(custom)) { this[key] = custom[key]; } } const self = this; Object.keys(this.headers).forEach((key) => { self.headers[key.toLowerCase()] = self.headers[key]; }); } class MockResponse extends EE { constructor() { super(); this.cachedHeaders = {}; } end() { this.emit('finish'); } setHeader(key, value) { this.cachedHeaders[key.toLowerCase()] = value; } getHeader(key) { return this.cachedHeaders[key.toLowerCase()]; } writeHead(code /* , headers */) { this.statusCode = code; } } function request( cl, method, originalUrl, code, reqHeaders, resHeaders, next, url, custom = undefined ) { const req = new MockRequest( 'my.remote.addr', method, originalUrl, reqHeaders, url, custom ); const res = new MockResponse(); if (next) { next = next.bind(null, req, res, () => {}); } else { next = () => {}; } cl(req, res, next); res.writeHead(code, resHeaders); res.end('chunk', 'encoding'); } test('log4js connect logger', (batch) => { const clm = require('../../lib/connect-logger'); batch.test('getConnectLoggerModule', (t) => { t.type(clm, 'function', 'should return a connect logger factory'); t.test( 'should take a log4js logger and return a "connect logger"', (assert) => { const ml = new MockLogger(); const cl = clm(ml); assert.type(cl, 'function'); assert.end(); } ); t.test('log events', (assert) => { const ml = new MockLogger(); const cl = clm(ml); request(cl, 'GET', 'http://url', 200); const { messages } = ml; assert.type(messages, 'Array'); assert.equal(messages.length, 1); assert.ok(levels.INFO.isEqualTo(messages[0].level)); assert.match(messages[0].message, 'GET'); assert.match(messages[0].message, 'http://url'); assert.match(messages[0].message, 'my.remote.addr'); assert.match(messages[0].message, '200'); assert.end(); }); t.test('log events with level below logging level', (assert) => { const ml = new MockLogger(); ml.level = levels.FATAL; const cl = clm(ml); request(cl, 'GET', 'http://url', 200); assert.type(ml.messages, 'Array'); assert.equal(ml.messages.length, 0); assert.end(); }); t.test('log events with non-default level and custom format', (assert) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.WARN, format: ':method :url' }); request(cl, 'GET', 'http://url', 200); const { messages } = ml; assert.type(messages, Array); assert.equal(messages.length, 1); assert.ok(levels.WARN.isEqualTo(messages[0].level)); assert.equal(messages[0].message, 'GET http://url'); assert.end(); }); t.test('adding multiple loggers should only log once', (assert) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.WARN, format: ':method :url' }); const nextLogger = clm(ml, { level: levels.INFO, format: ':method' }); request(cl, 'GET', 'http://url', 200, null, null, nextLogger); const { messages } = 
ml; assert.type(messages, Array); assert.equal(messages.length, 1); assert.ok(levels.WARN.isEqualTo(messages[0].level)); assert.equal(messages[0].message, 'GET http://url'); assert.end(); }); t.end(); }); batch.test('logger with options as string', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':method :url'); request(cl, 'POST', 'http://meh', 200); const { messages } = ml; t.equal(messages[0].message, 'POST http://meh'); t.end(); }); batch.test('auto log levels', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: 'auto', format: ':method :url' }); request(cl, 'GET', 'http://meh', 200); request(cl, 'GET', 'http://meh', 201); request(cl, 'GET', 'http://meh', 302); request(cl, 'GET', 'http://meh', 404); request(cl, 'GET', 'http://meh', 500); const { messages } = ml; t.test('should use INFO for 2xx', (assert) => { assert.ok(levels.INFO.isEqualTo(messages[0].level)); assert.ok(levels.INFO.isEqualTo(messages[1].level)); assert.end(); }); t.test('should use WARN for 3xx', (assert) => { assert.ok(levels.WARN.isEqualTo(messages[2].level)); assert.end(); }); t.test('should use ERROR for 4xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[3].level)); assert.end(); }); t.test('should use ERROR for 5xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[4].level)); assert.end(); }); t.end(); }); batch.test('logger with status code rules applied', (t) => { const ml = new MockLogger(); ml.level = levels.DEBUG; const clr = [ { codes: [201, 304], level: levels.DEBUG.toString() }, { from: 200, to: 299, level: levels.DEBUG.toString() }, { from: 300, to: 399, level: levels.INFO.toString() }, ]; const cl = clm(ml, { level: 'auto', format: ':method :url', statusRules: clr, }); request(cl, 'GET', 'http://meh', 200); request(cl, 'GET', 'http://meh', 201); request(cl, 'GET', 'http://meh', 302); request(cl, 'GET', 'http://meh', 304); request(cl, 'GET', 'http://meh', 404); request(cl, 'GET', 'http://meh', 500); const { messages } = ml; t.test('should use DEBUG for 2xx', (assert) => { assert.ok(levels.DEBUG.isEqualTo(messages[0].level)); assert.ok(levels.DEBUG.isEqualTo(messages[1].level)); assert.end(); }); t.test('should use WARN for 3xx, DEBUG for 304', (assert) => { assert.ok(levels.INFO.isEqualTo(messages[2].level)); assert.ok(levels.DEBUG.isEqualTo(messages[3].level)); assert.end(); }); t.test('should use ERROR for 4xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[4].level)); assert.end(); }); t.test('should use ERROR for 5xx', (assert) => { assert.ok(levels.ERROR.isEqualTo(messages[5].level)); assert.end(); }); t.end(); }); batch.test('format using a function', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, () => 'I was called'); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages[0].message, 'I was called'); t.end(); }); batch.test('format using a function that also uses tokens', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm( ml, (req, res, tokenReplacer) => `${req.method} ${tokenReplacer(':status')}` ); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages[0].message, 'GET 200'); t.end(); }); batch.test( 'format using a function, but do not log anything if the function returns nothing', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, () => null); request(cl, 'GET', 'http://blah', 200); t.equal(ml.messages.length, 0); t.end(); } ); batch.test('format that includes request headers', (t) => { const 
ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':req[Content-Type]'); request(cl, 'GET', 'http://blah', 200, { 'Content-Type': 'application/json', }); t.equal(ml.messages[0].message, 'application/json'); t.end(); }); batch.test('format that includes response headers', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, ':res[Content-Type]'); request(cl, 'GET', 'http://blah', 200, null, { 'Content-Type': 'application/cheese', }); t.equal(ml.messages[0].message, 'application/cheese'); t.end(); }); batch.test('url token should check originalUrl and url', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':url'); request(cl, 'GET', null, 200, null, null, null, 'http://cheese'); t.equal(ml.messages[0].message, 'http://cheese'); t.end(); }); batch.test('log events with custom token', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: ':method :url :custom_string', tokens: [ { token: ':custom_string', replacement: 'fooBAR', }, ], }); request(cl, 'GET', 'http://url', 200); t.type(ml.messages, 'Array'); t.equal(ml.messages.length, 1); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, 'GET http://url fooBAR'); t.end(); }); batch.test('log events with custom override token', (t) => { const ml = new MockLogger(); ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: ':method :url :date', tokens: [ { token: ':date', replacement: '20150310', }, ], }); request(cl, 'GET', 'http://url', 200); t.type(ml.messages, 'Array'); t.equal(ml.messages.length, 1); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, 'GET http://url 20150310'); t.end(); }); batch.test('log events with custom format', (t) => { const ml = new MockLogger(); const body = { say: 'hi!' }; ml.level = levels.INFO; const cl = clm(ml, { level: levels.INFO, format: (req, res, format) => format(`:method :url ${JSON.stringify(req.body)}`), }); request( cl, 'POST', 'http://url', 200, { 'Content-Type': 'application/json' }, null, null, null, { body } ); t.ok(levels.INFO.isEqualTo(ml.messages[0].level)); t.equal(ml.messages[0].message, `POST http://url ${JSON.stringify(body)}`); t.end(); }); batch.test( 'handle weird old node versions where socket contains socket', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':remote-addr'); const req = new MockRequest(null, 'GET', 'http://blah'); req.socket = { socket: { remoteAddress: 'this is weird' } }; const res = new MockResponse(); cl(req, res, () => {}); res.writeHead(200, {}); res.end('chunk', 'encoding'); t.equal(ml.messages[0].message, 'this is weird'); t.end(); } ); batch.test( 'handles as soon as any of the events end/finish/error/close triggers (only once)', (t) => { const ml = new MockLogger(); const cl = clm(ml, ':remote-addr'); const req = new MockRequest(null, 'GET', 'http://blah'); req.socket = { socket: { remoteAddress: 'this is weird' } }; const res = new MockResponse(); cl(req, res, () => {}); res.writeHead(200, {}); t.equal(ml.messages.length, 0); res.emit('end'); res.emit('finish'); res.emit('error'); res.emit('close'); t.equal(ml.messages.length, 1); t.equal(ml.messages[0].message, 'this is weird'); t.end(); } ); batch.end(); });
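The custom-token and function-format cases covered above can also be combined; a small sketch (the `:custom_string` token name simply mirrors the one used in the tests):

```javascript
// Sketch: custom tokens plus a format function, as the tests above cover.
const log4js = require("log4js");

const httpLogger = log4js.connectLogger(log4js.getLogger("http"), {
  level: "info",
  // the format function receives (req, res, formatter); returning null skips logging
  format: (req, res, formatter) => formatter(":method :url :custom_string"),
  tokens: [{ token: ":custom_string", replacement: "fooBAR" }],
});
// httpLogger is ordinary (req, res, next) middleware and can be mounted in
// any connect-compatible framework.
```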
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./lib/appenders/categoryFilter.js
const debug = require('debug')('log4js:categoryFilter'); function categoryFilter(excludes, appender) { if (typeof excludes === 'string') excludes = [excludes]; return (logEvent) => { debug(`Checking ${logEvent.categoryName} against ${excludes}`); if (excludes.indexOf(logEvent.categoryName) === -1) { debug('Not excluded, sending to appender'); appender(logEvent); } }; } function configure(config, layouts, findAppender) { const appender = findAppender(config.appender); return categoryFilter(config.exclude, appender); } module.exports.configure = configure;
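The module above implements the built-in `categoryFilter` appender. A configuration sketch showing the `exclude`/`appender` fields its `configure()` reads:

```javascript
// Sketch: using the categoryFilter appender to drop events from selected
// categories before they reach the real appender.
const log4js = require("log4js");

log4js.configure({
  appenders: {
    out: { type: "stdout" },
    filtered: {
      type: "categoryFilter",
      exclude: ["noisy"], // a single string is also accepted and wrapped in an array
      appender: "out",    // events from any other category are forwarded here
    },
  },
  categories: { default: { appenders: ["filtered"], level: "debug" } },
});

log4js.getLogger("noisy").debug("this is dropped");
log4js.getLogger("app").debug("this reaches stdout");
```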
const debug = require('debug')('log4js:categoryFilter'); function categoryFilter(excludes, appender) { if (typeof excludes === 'string') excludes = [excludes]; return (logEvent) => { debug(`Checking ${logEvent.categoryName} against ${excludes}`); if (excludes.indexOf(logEvent.categoryName) === -1) { debug('Not excluded, sending to appender'); appender(logEvent); } }; } function configure(config, layouts, findAppender) { const appender = findAppender(config.appender); return categoryFilter(config.exclude, appender); } module.exports.configure = configure;
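For unit-test style verification, the exported `configure(config, layouts, findAppender)` can also be exercised directly, in the same spirit as the tap tests elsewhere in this dump; a sketch (the require path is an assumption):

```javascript
// Sketch: calling the categoryFilter configure() export directly with a fake
// findAppender, mirroring how the tap tests in this repo stub collaborators.
const categoryFilter = require("./lib/appenders/categoryFilter"); // path is an assumption

const received = [];
const filtering = categoryFilter.configure(
  { exclude: "noisy", appender: "out" },        // string exclude is wrapped into an array
  null,                                          // layouts are not used by this appender
  () => (logEvent) => received.push(logEvent)    // findAppender -> the wrapped appender
);

filtering({ categoryName: "noisy", data: ["dropped"] });
filtering({ categoryName: "app", data: ["kept"] });
// received now contains only the 'app' event
```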
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/configuration-inheritance-test.js
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const categories = require('../../lib/categories'); test('log4js category inherit all appenders from direct parent', (batch) => { batch.test('should inherit appenders from direct parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1', 'stdout2'], level: 'INFO' }, 'catA.catB': { level: 'DEBUG' }, }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = categories.appendersForCategory(childCategoryName); const childLevel = categories.getLevelForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 2 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); t.equal(childLevel.levelStr, 'DEBUG', 'child level overrides parent'); t.end(); }); batch.test( 'multiple children should inherit config from shared parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB.cat1': { level: 'DEBUG' }, // should get sdtout1, DEBUG 'catA.catB.cat2': { appenders: ['stdout2'] }, // should get sdtout1,sdtout2, INFO }, }; log4js.configure(config); const child1CategoryName = 'catA.catB.cat1'; const child1Appenders = categories.appendersForCategory(child1CategoryName); const child1Level = categories.getLevelForCategory(child1CategoryName); t.equal(child1Appenders.length, 1, 'inherited 1 appender'); t.ok( child1Appenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.equal(child1Level.levelStr, 'DEBUG', 'child level overrides parent'); const child2CategoryName = 'catA.catB.cat2'; const child2Appenders = categories.appendersForCategory(child2CategoryName); const child2Level = categories.getLevelForCategory(child2CategoryName); t.ok(child2Appenders); t.equal( child2Appenders.length, 2, 'inherited 1 appenders, plus its original' ); t.ok( child2Appenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( child2Appenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.equal(child2Level.levelStr, 'INFO', 'inherited parent level'); t.end(); } ); batch.test('should inherit appenders from multiple parents', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO' }, // should get stdout1 and stdout2 'catA.catB.catC': { level: 'DEBUG' }, // should get stdout1 and stdout2 }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 2 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); const firstParentName = 'catA.catB'; const firstParentAppenders = 
categories.appendersForCategory(firstParentName); t.ok(firstParentAppenders); t.equal(firstParentAppenders.length, 2, 'ended up with 2 appenders'); t.ok( firstParentAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( firstParentAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); }); batch.test( 'should inherit appenders from deep parent with missing direct parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, // no catA.catB, but should get created, with stdout1 'catA.catB.catC': { level: 'DEBUG' }, // should get stdout1 }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited 1 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); const firstParentCategoryName = 'catA.catB'; const firstParentAppenders = categories.appendersForCategory( firstParentCategoryName ); t.ok(firstParentAppenders, 'catA.catB got created implicitily'); t.equal( firstParentAppenders.length, 1, 'created with 1 inherited appender' ); t.ok( firstParentAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.end(); } ); batch.test('should deal gracefully with missing parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, // no catA nor catA.catB, but should get created, with default values 'catA.catB.catC': { appenders: ['stdout2'], level: 'DEBUG' }, // should get stdout2, DEBUG }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1); t.ok(childAppenders.some((a) => a.label === 'stdout2')); t.end(); }); batch.test( 'should not get duplicate appenders if parent has the same one', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1', 'stdout2'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout1'], level: 'DEBUG' }, }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 1 appender'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'still have stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); t.end(); } ); batch.test('inherit:falses should disable inheritance', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO', inherit: false }, // should not inherit from catA }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = 
categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited no appender'); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); }); batch.test( 'inheritance should stop if direct parent has inherit off', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO', inherit: false, }, // should not inherit from catA 'catA.catB.catC': { level: 'DEBUG' }, // should inherit from catB only }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited 1 appender'); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); const firstParentCategoryName = 'catA.catB'; const firstParentAppenders = categories.appendersForCategory( firstParentCategoryName ); t.ok(firstParentAppenders); t.equal(firstParentAppenders.length, 1, 'did not inherit new appenders'); t.ok( firstParentAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); } ); batch.test('should inherit level when it is missing', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, // no catA.catB, but should get created, with stdout1, level INFO 'catA.catB.catC': {}, // should get stdout1, level INFO }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childLevel = categories.getLevelForCategory(childCategoryName); t.equal(childLevel.levelStr, 'INFO', 'inherited level'); const firstParentCategoryName = 'catA.catB'; const firstParentLevel = categories.getLevelForCategory( firstParentCategoryName ); t.equal( firstParentLevel.levelStr, 'INFO', 'generate parent inherited level from base' ); t.end(); }); batch.end(); });
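These inheritance tests read the merged configuration back through the internal `categories` module. A hedged sketch of the same behaviour from the user's side (category and file names are illustrative):

```javascript
// Sketch: dotted child categories inherit appenders and level from their
// parents unless they override them, as the assertions above document.
const log4js = require("log4js");

log4js.configure({
  appenders: {
    out: { type: "stdout" },
    file: { type: "file", filename: "app.log" },
  },
  categories: {
    default: { appenders: ["out"], level: "error" },
    catA: { appenders: ["out", "file"], level: "info" },
    "catA.catB": { level: "debug" }, // inherits out + file from catA, overrides only the level
  },
});

log4js.getLogger("catA.catB").debug("goes to both stdout and app.log");
```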
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const categories = require('../../lib/categories'); test('log4js category inherit all appenders from direct parent', (batch) => { batch.test('should inherit appenders from direct parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1', 'stdout2'], level: 'INFO' }, 'catA.catB': { level: 'DEBUG' }, }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = categories.appendersForCategory(childCategoryName); const childLevel = categories.getLevelForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 2 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); t.equal(childLevel.levelStr, 'DEBUG', 'child level overrides parent'); t.end(); }); batch.test( 'multiple children should inherit config from shared parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB.cat1': { level: 'DEBUG' }, // should get sdtout1, DEBUG 'catA.catB.cat2': { appenders: ['stdout2'] }, // should get sdtout1,sdtout2, INFO }, }; log4js.configure(config); const child1CategoryName = 'catA.catB.cat1'; const child1Appenders = categories.appendersForCategory(child1CategoryName); const child1Level = categories.getLevelForCategory(child1CategoryName); t.equal(child1Appenders.length, 1, 'inherited 1 appender'); t.ok( child1Appenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.equal(child1Level.levelStr, 'DEBUG', 'child level overrides parent'); const child2CategoryName = 'catA.catB.cat2'; const child2Appenders = categories.appendersForCategory(child2CategoryName); const child2Level = categories.getLevelForCategory(child2CategoryName); t.ok(child2Appenders); t.equal( child2Appenders.length, 2, 'inherited 1 appenders, plus its original' ); t.ok( child2Appenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( child2Appenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.equal(child2Level.levelStr, 'INFO', 'inherited parent level'); t.end(); } ); batch.test('should inherit appenders from multiple parents', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO' }, // should get stdout1 and stdout2 'catA.catB.catC': { level: 'DEBUG' }, // should get stdout1 and stdout2 }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 2 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); const firstParentName = 'catA.catB'; const firstParentAppenders = 
categories.appendersForCategory(firstParentName); t.ok(firstParentAppenders); t.equal(firstParentAppenders.length, 2, 'ended up with 2 appenders'); t.ok( firstParentAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.ok( firstParentAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); }); batch.test( 'should inherit appenders from deep parent with missing direct parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, // no catA.catB, but should get created, with stdout1 'catA.catB.catC': { level: 'DEBUG' }, // should get stdout1 }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited 1 appenders'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); const firstParentCategoryName = 'catA.catB'; const firstParentAppenders = categories.appendersForCategory( firstParentCategoryName ); t.ok(firstParentAppenders, 'catA.catB got created implicitily'); t.equal( firstParentAppenders.length, 1, 'created with 1 inherited appender' ); t.ok( firstParentAppenders.some((a) => a.label === 'stdout1'), 'inherited stdout1' ); t.end(); } ); batch.test('should deal gracefully with missing parent', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, // no catA nor catA.catB, but should get created, with default values 'catA.catB.catC': { appenders: ['stdout2'], level: 'DEBUG' }, // should get stdout2, DEBUG }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1); t.ok(childAppenders.some((a) => a.label === 'stdout2')); t.end(); }); batch.test( 'should not get duplicate appenders if parent has the same one', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1', 'stdout2'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout1'], level: 'DEBUG' }, }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 2, 'inherited 1 appender'); t.ok( childAppenders.some((a) => a.label === 'stdout1'), 'still have stdout1' ); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); t.end(); } ); batch.test('inherit:falses should disable inheritance', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO', inherit: false }, // should not inherit from catA }, }; log4js.configure(config); const childCategoryName = 'catA.catB'; const childAppenders = 
categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited no appender'); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); }); batch.test( 'inheritance should stop if direct parent has inherit off', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, 'catA.catB': { appenders: ['stdout2'], level: 'INFO', inherit: false, }, // should not inherit from catA 'catA.catB.catC': { level: 'DEBUG' }, // should inherit from catB only }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childAppenders = categories.appendersForCategory(childCategoryName); t.ok(childAppenders); t.equal(childAppenders.length, 1, 'inherited 1 appender'); t.ok( childAppenders.some((a) => a.label === 'stdout2'), 'inherited stdout2' ); const firstParentCategoryName = 'catA.catB'; const firstParentAppenders = categories.appendersForCategory( firstParentCategoryName ); t.ok(firstParentAppenders); t.equal(firstParentAppenders.length, 1, 'did not inherit new appenders'); t.ok( firstParentAppenders.some((a) => a.label === 'stdout2'), 'kept stdout2' ); t.end(); } ); batch.test('should inherit level when it is missing', (t) => { const config = { appenders: { stdout1: { type: 'dummy-appender', label: 'stdout1' }, stdout2: { type: 'dummy-appender', label: 'stdout2' }, }, categories: { default: { appenders: ['stdout1'], level: 'ERROR' }, catA: { appenders: ['stdout1'], level: 'INFO' }, // no catA.catB, but should get created, with stdout1, level INFO 'catA.catB.catC': {}, // should get stdout1, level INFO }, }; log4js.configure(config); const childCategoryName = 'catA.catB.catC'; const childLevel = categories.getLevelForCategory(childCategoryName); t.equal(childLevel.levelStr, 'INFO', 'inherited level'); const firstParentCategoryName = 'catA.catB'; const firstParentLevel = categories.getLevelForCategory( firstParentCategoryName ); t.equal( firstParentLevel.levelStr, 'INFO', 'generate parent inherited level from base' ); t.end(); }); batch.end(); });
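The inheritance behaviour exercised by this test suite can be summed up with a small configuration. The sketch below is a minimal, hypothetical config (appender names and file paths are invented) showing a child category picking up its parent's appenders while overriding the level, and `inherit: false` opting out of inheritance, mirroring the cases asserted above:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    everything: { type: 'file', filename: 'all.log' },
    audit: { type: 'file', filename: 'audit.log' },
  },
  categories: {
    default: { appenders: ['everything'], level: 'error' },
    // parent category
    app: { appenders: ['everything'], level: 'info' },
    // inherits the "everything" appender from "app", overrides only the level
    'app.db': { level: 'debug' },
    // opts out of inheritance: gets "audit" only, nothing from "app"
    'app.audit': { appenders: ['audit'], level: 'info', inherit: false },
  },
});

log4js.getLogger('app.db').debug('written to all.log at DEBUG');
log4js.getLogger('app.audit').info('written to audit.log only');
```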
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./examples/smtp-appender.js
// Note that smtp appender needs nodemailer to work. // If you haven't got nodemailer installed, you'll get cryptic // "cannot find module" errors when using the smtp appender const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', }, mail: { type: '@log4js-node/smtp', recipients: 'logfilerecipient@logging.com', sendInterval: 5, transport: 'SMTP', SMTP: { host: 'smtp.gmail.com', secureConnection: true, port: 465, auth: { user: 'someone@gmail', pass: '********************', }, debug: true, }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, mailer: { appenders: ['mail'], level: 'info' }, }, }); const log = log4js.getLogger('test'); const logmailer = log4js.getLogger('mailer'); function doTheLogging(x) { log.info('Logging something %d', x); logmailer.info('Logging something %d', x); } for (let i = 0; i < 500; i += 1) { doTheLogging(i); }
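The PR described in the surrounding metadata ("loop through location keys instead of hard-coding one-by-one") refers to how LoggingEvent copies call-stack location fields. The snippet below is only an illustrative sketch of that general pattern, not the actual LoggingEvent implementation; the key names are assumed from the pattern-layout tokens (%f, %l, %o, %s) used elsewhere in this repository.

```javascript
// Hypothetical sketch of the refactor: copy a fixed list of location keys
// in a loop rather than assigning each property by hand.
const LOCATION_KEYS = ['fileName', 'lineNumber', 'columnNumber', 'callStack'];

function applyLocation(event, location) {
  // before: event.fileName = location.fileName; event.lineNumber = ...; etc.
  // after: one loop over the known keys
  LOCATION_KEYS.forEach((key) => {
    event[key] = location[key];
  });
  return event;
}

// usage (values are made up purely for illustration)
const event = applyLocation({}, {
  fileName: 'app.js',
  lineNumber: 42,
  columnNumber: 7,
  callStack: 'at doWork (app.js:42:7)',
});
```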
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/stdoutAppender-test.js
const { test } = require('tap'); const sandbox = require('@log4js-node/sandboxed-module'); const layouts = require('../../lib/layouts'); test('stdout appender', (t) => { const output = []; const appender = sandbox .require('../../lib/appenders/stdout', { globals: { process: { stdout: { write(data) { output.push(data); }, }, }, }, }) .configure( { type: 'stdout', layout: { type: 'messagePassThrough' } }, layouts ); appender({ data: ['cheese'] }); t.plan(2); t.equal(output.length, 1, 'There should be one message.'); t.equal(output[0], 'cheese\n', 'The message should be cheese.'); t.end(); });
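The same sandboxing technique should carry over to the other console-style appenders. The sketch below assumes a sibling `../../lib/appenders/stderr` module with the same `configure(config, layouts)` signature as the stdout appender; treat it as an untested illustration rather than part of the suite.

```javascript
const { test } = require('tap');
const sandbox = require('@log4js-node/sandboxed-module');
const layouts = require('../../lib/layouts');

test('stderr appender (sketch)', (t) => {
  const output = [];
  // stub process.stderr.write so the appender writes into an array instead
  const appender = sandbox
    .require('../../lib/appenders/stderr', {
      globals: {
        process: { stderr: { write(data) { output.push(data); } } },
      },
    })
    .configure(
      { type: 'stderr', layout: { type: 'messagePassThrough' } },
      layouts
    );

  appender({ data: ['cheese'] });
  t.equal(output.length, 1, 'one message written to stderr');
  t.equal(output[0], 'cheese\n', 'message should pass through unchanged');
  t.end();
});
```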
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./examples/stacktrace.js
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { 'console-appender': { type: 'console', layout: { type: 'pattern', pattern: '%[[%p]%] - %10.-100f{2} | %7.12l:%7.12o - %[%m%]', }, }, }, categories: { default: { appenders: ['console-appender'], enableCallStack: true, level: 'info', }, }, }); log4js.getLogger().info('This should not cause problems');
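For reference, the width/truncation specifiers in the pattern above (%10.-100f{2}, %7.12l) are optional; a plainer call-stack pattern using the same tokens might look like the sketch below. The layout string is an illustrative choice, not taken from the repository.

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    out: {
      type: 'console',
      // %f{1} = file name (last path segment), %l = line, %o = column, %m = message
      layout: { type: 'pattern', pattern: '[%p] %f{1}:%l:%o - %m' },
    },
  },
  categories: {
    // enableCallStack must be on for %f/%l/%o to be populated
    default: { appenders: ['out'], enableCallStack: true, level: 'info' },
  },
});

log4js.getLogger().info('call-stack tokens resolved per log call');
```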
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./lib/appenders/multiFile.js
const debug = require('debug')('log4js:multiFile'); const path = require('path'); const fileAppender = require('./file'); const findFileKey = (property, event) => event[property] || event.context[property]; module.exports.configure = (config, layouts) => { debug('Creating a multi-file appender'); const files = new Map(); const timers = new Map(); function checkForTimeout(fileKey) { const timer = timers.get(fileKey); const app = files.get(fileKey); /* istanbul ignore else: failsafe */ if (timer && app) { if (Date.now() - timer.lastUsed > timer.timeout) { debug('%s not used for > %d ms => close', fileKey, timer.timeout); clearInterval(timer.interval); timers.delete(fileKey); files.delete(fileKey); app.shutdown((err) => { if (err) { debug('ignore error on file shutdown: %s', err.message); } }); } } else { // will never get here as files and timers are coupled to be added and deleted at same place debug('timer or app does not exist'); } } const appender = (logEvent) => { const fileKey = findFileKey(config.property, logEvent); debug('fileKey for property ', config.property, ' is ', fileKey); if (fileKey) { let file = files.get(fileKey); debug('existing file appender is ', file); if (!file) { debug('creating new file appender'); config.filename = path.join(config.base, fileKey + config.extension); file = fileAppender.configure(config, layouts); files.set(fileKey, file); if (config.timeout) { debug('creating new timer'); timers.set(fileKey, { timeout: config.timeout, lastUsed: Date.now(), interval: setInterval( checkForTimeout.bind(null, fileKey), config.timeout ), }); } } else if (config.timeout) { debug('%s extending activity', fileKey); timers.get(fileKey).lastUsed = Date.now(); } file(logEvent); } else { debug('No fileKey for logEvent, quietly ignoring this log event'); } }; appender.shutdown = (cb) => { let shutdownFunctions = files.size; if (shutdownFunctions <= 0) { cb(); } let error; timers.forEach((timer, fileKey) => { debug('clearing timer for ', fileKey); clearInterval(timer.interval); }); files.forEach((app, fileKey) => { debug('calling shutdown for ', fileKey); app.shutdown((err) => { error = error || err; shutdownFunctions -= 1; if (shutdownFunctions <= 0) { cb(error); } }); }); }; return appender; };
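A usage sketch for the appender above: `property` names a key looked up on the log event (falling back to its context), and `base` + key + `extension` becomes the target filename. The directory, property name, and timeout below are illustrative assumptions.

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    multi: {
      type: 'multiFile',
      base: 'logs/',        // directory the per-key files are written into
      property: 'userID',   // looked up on the event, then on event.context
      extension: '.log',
      timeout: 60000,       // close idle per-key file appenders after 60s
    },
  },
  categories: { default: { appenders: ['multi'], level: 'info' } },
});

const logger = log4js.getLogger();
logger.addContext('userID', 'u1234');
logger.info('goes to logs/u1234.log'); // fileKey resolved from the context
```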
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/fileSyncAppender-test.js
const { test } = require('tap'); const fs = require('fs'); const path = require('path'); const EOL = require('os').EOL || '\n'; const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); function remove(filename) { try { fs.unlinkSync(filename); } catch (e) { // doesn't really matter if it failed } } test('log4js fileSyncAppender', (batch) => { batch.test('with default fileSyncAppender settings', (t) => { const testFile = path.join(__dirname, '/fa-default-sync-test.log'); const logger = log4js.getLogger('default-settings'); remove(testFile); t.teardown(() => { remove(testFile); }); log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile } }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); logger.info('This should be in the file.'); fs.readFile(testFile, 'utf8', (err, fileContents) => { t.match(fileContents, `This should be in the file.${EOL}`); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }); batch.test('with existing file', (t) => { const testFile = path.join(__dirname, '/fa-existing-file-sync-test.log'); const logger = log4js.getLogger('default-settings'); remove(testFile); t.teardown(() => { remove(testFile); }); log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile } }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); logger.info('This should be in the file.'); log4js.shutdown(() => { log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile } }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); logger.info('This should also be in the file.'); fs.readFile(testFile, 'utf8', (err, fileContents) => { t.match(fileContents, `This should be in the file.${EOL}`); t.match(fileContents, `This should also be in the file.${EOL}`); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }); }); batch.test('should give error if invalid filename', async (t) => { const file = ''; t.throws( () => log4js.configure({ appenders: { file: { type: 'fileSync', filename: file, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Invalid filename: ${file}`) ); const dir = `.${path.sep}`; t.throws( () => log4js.configure({ appenders: { file: { type: 'fileSync', filename: dir, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Filename is a directory: ${dir}`) ); t.end(); }); batch.test('should give error if invalid maxLogSize', async (t) => { const maxLogSize = -1; const expectedError = new Error(`maxLogSize (${maxLogSize}) should be > 0`); t.throws( () => log4js.configure({ appenders: { file: { type: 'fileSync', filename: path.join( __dirname, 'fa-invalidMaxFileSize-sync-test.log' ), maxLogSize: -1, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), expectedError ); t.end(); }); batch.test('with a max file size and no backups', (t) => { const testFile = path.join(__dirname, '/fa-maxFileSize-sync-test.log'); const logger = log4js.getLogger('max-file-size'); remove(testFile); t.teardown(() => { remove(testFile); }); // log file of 100 bytes maximum, no backups log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, maxLogSize: 100, backups: 0, }, }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is an intermediate 
log message.'); logger.info('This is the second log message.'); t.test('log file should only contain the second message', (assert) => { fs.readFile(testFile, 'utf8', (err, fileContents) => { assert.match(fileContents, `This is the second log message.${EOL}`); assert.equal( fileContents.indexOf('This is the first log message.'), -1 ); assert.end(); }); }); t.test('there should be one test files', (assert) => { fs.readdir(__dirname, (err, files) => { const logFiles = files.filter((file) => file.includes('fa-maxFileSize-sync-test.log') ); assert.equal(logFiles.length, 1); assert.end(); }); }); t.end(); }); batch.test('with a max file size in unit mode and no backups', (t) => { const testFile = path.join(__dirname, '/fa-maxFileSize-unit-sync-test.log'); const logger = log4js.getLogger('max-file-size-unit'); remove(testFile); remove(`${testFile}.1`); t.teardown(() => { remove(testFile); remove(`${testFile}.1`); }); // log file of 100 bytes maximum, no backups log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, maxLogSize: '1K', backups: 0, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); const maxLine = 22; // 1024 max file size / 47 bytes per line for (let i = 0; i < maxLine; i++) { logger.info('These are the log messages for the first file.'); // 46 bytes per line + '\n' } logger.info('This is the second log message.'); t.test('log file should only contain the second message', (assert) => { fs.readFile(testFile, 'utf8', (err, fileContents) => { assert.match(fileContents, `This is the second log message.${EOL}`); assert.notMatch( fileContents, 'These are the log messages for the first file.' ); assert.end(); }); }); t.test('there should be one test file', (assert) => { fs.readdir(__dirname, (err, files) => { const logFiles = files.filter((file) => file.includes('fa-maxFileSize-unit-sync-test.log') ); assert.equal(logFiles.length, 1); assert.end(); }); }); t.end(); }); batch.test('with a max file size and 2 backups', (t) => { const testFile = path.join( __dirname, '/fa-maxFileSize-with-backups-sync-test.log' ); const logger = log4js.getLogger('max-file-size-backups'); remove(testFile); remove(`${testFile}.1`); remove(`${testFile}.2`); t.teardown(() => { remove(testFile); remove(`${testFile}.1`); remove(`${testFile}.2`); }); // log file of 50 bytes maximum, 2 backups log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, maxLogSize: 50, backups: 2, }, }, categories: { default: { appenders: ['sync'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is the second log message.'); logger.info('This is the third log message.'); logger.info('This is the fourth log message.'); t.test('the log files', (assert) => { assert.plan(5); fs.readdir(__dirname, (err, files) => { const logFiles = files.filter((file) => file.includes('fa-maxFileSize-with-backups-sync-test.log') ); assert.equal(logFiles.length, 3, 'should be 3 files'); assert.same( logFiles, [ 'fa-maxFileSize-with-backups-sync-test.log', 'fa-maxFileSize-with-backups-sync-test.log.1', 'fa-maxFileSize-with-backups-sync-test.log.2', ], 'should be named in sequence' ); fs.readFile( path.join(__dirname, logFiles[0]), 'utf8', (e, contents) => { assert.match(contents, 'This is the fourth log message.'); } ); fs.readFile( path.join(__dirname, logFiles[1]), 'utf8', (e, contents) => { assert.match(contents, 'This is the third log message.'); } ); fs.readFile( path.join(__dirname, logFiles[2]), 'utf8', (e, 
contents) => { assert.match(contents, 'This is the second log message.'); } ); }); }); t.end(); }); batch.test('configure with fileSyncAppender', (t) => { const testFile = 'tmp-sync-tests.log'; remove(testFile); t.teardown(() => { remove(testFile); }); // this config defines one file appender (to ./tmp-sync-tests.log) // and sets the log level for "tests" to WARN log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, tests: { appenders: ['sync'], level: 'warn' }, }, }); const logger = log4js.getLogger('tests'); logger.info('this should not be written to the file'); logger.warn('this should be written to the file'); fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.equal(contents.indexOf('this should not be written to the file'), -1); t.end(); }); }); batch.test( 'configure with non-existent multi-directory (recursive, nodejs >= 10.12.0)', (t) => { const testFile = 'tmpA/tmpB/tmpC/tmp-sync-tests-recursive.log'; remove(testFile); t.teardown(() => { remove(testFile); try { fs.rmdirSync('tmpA/tmpB/tmpC'); fs.rmdirSync('tmpA/tmpB'); fs.rmdirSync('tmpA'); } catch (e) { // doesn't matter } }); log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('this should be written to the file'); fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.end(); }); } ); batch.test( 'configure with non-existent multi-directory (non-recursive, nodejs < 10.12.0)', (t) => { const testFile = 'tmpA/tmpB/tmpC/tmp-sync-tests-non-recursive.log'; remove(testFile); t.teardown(() => { remove(testFile); try { fs.rmdirSync('tmpA/tmpB/tmpC'); fs.rmdirSync('tmpA/tmpB'); fs.rmdirSync('tmpA'); } catch (e) { // doesn't matter } }); const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { fs: { ...fs, mkdirSync(dirPath, options) { return fs.mkdirSync(dirPath, { ...options, ...{ recursive: false }, }); }, }, }, }); sandboxedLog4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, }, }); const logger = sandboxedLog4js.getLogger(); logger.info('this should be written to the file'); fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.end(); }); } ); batch.test( 'configure with non-existent multi-directory (error handling)', (t) => { const testFile = 'tmpA/tmpB/tmpC/tmp-sync-tests-error-handling.log'; remove(testFile); t.teardown(() => { remove(testFile); try { fs.rmdirSync('tmpA/tmpB/tmpC'); fs.rmdirSync('tmpA/tmpB'); fs.rmdirSync('tmpA'); } catch (e) { // doesn't matter } }); const errorEPERM = new Error('EPERM'); errorEPERM.code = 'EPERM'; let sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { fs: { ...fs, mkdirSync() { throw errorEPERM; }, }, }, }); t.throws( () => sandboxedLog4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, }, }), errorEPERM ); const errorEROFS = new Error('EROFS'); errorEROFS.code = 'EROFS'; sandboxedLog4js = 
sandbox.require('../../lib/log4js', { requires: { fs: { ...fs, mkdirSync() { throw errorEROFS; }, statSync() { return { isDirectory() { return false; }, }; }, }, }, }); t.throws( () => sandboxedLog4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, }, }), errorEROFS ); fs.mkdirSync('tmpA'); fs.mkdirSync('tmpA/tmpB'); fs.mkdirSync('tmpA/tmpB/tmpC'); sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { fs: { ...fs, mkdirSync() { throw errorEROFS; }, }, }, }); t.doesNotThrow(() => sandboxedLog4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['sync'], level: 'debug' }, }, }) ); t.end(); } ); batch.test('test options', (t) => { const testFile = 'tmp-options-tests.log'; remove(testFile); t.teardown(() => { remove(testFile); }); // using non-standard options log4js.configure({ appenders: { sync: { type: 'fileSync', filename: testFile, layout: { type: 'messagePassThrough' }, flags: 'w', encoding: 'ascii', mode: 0o666, }, }, categories: { default: { appenders: ['sync'], level: 'info' }, }, }); const logger = log4js.getLogger(); logger.warn('log message'); fs.readFile(testFile, 'ascii', (err, contents) => { t.match(contents, `log message${EOL}`); t.end(); }); }); batch.end(); });
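As the unit-mode test above relies on, `maxLogSize` can also be given as a string with a size suffix instead of a byte count; a minimal config along those lines (the filename is a placeholder):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    sync: {
      type: 'fileSync',
      filename: 'app-sync.log', // placeholder path
      maxLogSize: '1K',         // string form: 1 kilobyte, i.e. 1024 bytes
      backups: 0,               // truncate instead of keeping rolled files
    },
  },
  categories: { default: { appenders: ['sync'], level: 'debug' } },
});
```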
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/consoleAppender-test.js
const { test } = require('tap'); const sandbox = require('@log4js-node/sandboxed-module'); const consoleAppender = require('../../lib/appenders/console'); test('log4js console appender', (batch) => { batch.test('should export a configure function', (t) => { t.type(consoleAppender.configure, 'function'); t.end(); }); batch.test('should use default layout if none specified', (t) => { const messages = []; const fakeConsole = { log(msg) { messages.push(msg); }, }; const log4js = sandbox.require('../../lib/log4js', { globals: { console: fakeConsole, }, }); log4js.configure({ appenders: { console: { type: 'console' } }, categories: { default: { appenders: ['console'], level: 'DEBUG' } }, }); log4js.getLogger().info('blah'); t.match(messages[0], /.*default.*blah/); t.end(); }); batch.test('should output to console', (t) => { const messages = []; const fakeConsole = { log(msg) { messages.push(msg); }, }; const log4js = sandbox.require('../../lib/log4js', { globals: { console: fakeConsole, }, }); log4js.configure({ appenders: { console: { type: 'console', layout: { type: 'messagePassThrough' } }, }, categories: { default: { appenders: ['console'], level: 'DEBUG' } }, }); log4js.getLogger().info('blah'); t.equal(messages[0], 'blah'); t.end(); }); batch.end(); });
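Outside the sandbox, the appender under test is configured the same way; a minimal non-test usage mirroring the second case above:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    // messagePassThrough prints only the formatted message, no timestamp/level
    out: { type: 'console', layout: { type: 'messagePassThrough' } },
  },
  categories: { default: { appenders: ['out'], level: 'debug' } },
});

log4js.getLogger().info('blah'); // console.log receives just "blah"
```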
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./lib/appenders/file.js
const debug = require('debug')('log4js:file'); const path = require('path'); const streams = require('streamroller'); const os = require('os'); const eol = os.EOL; let mainSighupListenerStarted = false; const sighupListeners = new Set(); function mainSighupHandler() { sighupListeners.forEach((app) => { app.sighupHandler(); }); } /** * File Appender writing the logs to a text file. Supports rolling of logs by size. * * @param file the file log messages will be written to * @param layout a function that takes a logEvent and returns a string * (defaults to basicLayout). * @param logSize - the maximum size (in bytes) for a log file, * if not provided then logs won't be rotated. * @param numBackups - the number of log files to keep after logSize * has been reached (default 5) * @param options - options to be passed to the underlying stream * @param timezoneOffset - optional timezone offset in minutes (default system local) */ function fileAppender( file, layout, logSize, numBackups, options, timezoneOffset ) { if (typeof file !== 'string' || file.length === 0) { throw new Error(`Invalid filename: ${file}`); } else if (file.endsWith(path.sep)) { throw new Error(`Filename is a directory: ${file}`); } else { // handle ~ expansion: https://github.com/nodejs/node/issues/684 // exclude ~ and ~filename as these can be valid files file = file.replace(new RegExp(`^~(?=${path.sep}.+)`), os.homedir()); } file = path.normalize(file); numBackups = !numBackups && numBackups !== 0 ? 5 : numBackups; debug( 'Creating file appender (', file, ', ', logSize, ', ', numBackups, ', ', options, ', ', timezoneOffset, ')' ); function openTheStream(filePath, fileSize, numFiles, opt) { const stream = new streams.RollingFileStream( filePath, fileSize, numFiles, opt ); stream.on('error', (err) => { // eslint-disable-next-line no-console console.error( 'log4js.fileAppender - Writing to file %s, error happened ', filePath, err ); }); stream.on('drain', () => { process.emit('log4js:pause', false); }); return stream; } let writer = openTheStream(file, logSize, numBackups, options); const app = function (loggingEvent) { if (!writer.writable) { return; } if (options.removeColor === true) { // eslint-disable-next-line no-control-regex const regex = /\x1b[[0-9;]*m/g; loggingEvent.data = loggingEvent.data.map((d) => { if (typeof d === 'string') return d.replace(regex, ''); return d; }); } if (!writer.write(layout(loggingEvent, timezoneOffset) + eol, 'utf8')) { process.emit('log4js:pause', true); } }; app.reopen = function () { writer.end(() => { writer = openTheStream(file, logSize, numBackups, options); }); }; app.sighupHandler = function () { debug('SIGHUP handler called.'); app.reopen(); }; app.shutdown = function (complete) { sighupListeners.delete(app); if (sighupListeners.size === 0 && mainSighupListenerStarted) { process.removeListener('SIGHUP', mainSighupHandler); mainSighupListenerStarted = false; } writer.end('', 'utf-8', complete); }; // On SIGHUP, close and reopen all files. This allows this appender to work with // logrotate. Note that if you are using logrotate, you should not set // `logSize`. 
sighupListeners.add(app); if (!mainSighupListenerStarted) { process.on('SIGHUP', mainSighupHandler); mainSighupListenerStarted = true; } return app; } function configure(config, layouts) { let layout = layouts.basicLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } // security default (instead of relying on streamroller default) config.mode = config.mode || 0o600; return fileAppender( config.filename, layout, config.maxLogSize, config.backups, config, config.timezoneOffset ); } module.exports.configure = configure;
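A configuration sketch for the appender above. Per the comment in the source, when an external tool such as logrotate manages rotation, `maxLogSize` should be left unset and the process can be told to reopen its files with SIGHUP; the paths below are placeholders.

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    app: {
      type: 'file',
      filename: 'logs/app.log', // placeholder path
      // no maxLogSize: rotation is delegated to logrotate
      removeColor: true,        // strip ANSI colour codes before writing
      mode: 0o600,              // matches the appender's security default
    },
  },
  categories: { default: { appenders: ['app'], level: 'info' } },
});

log4js.getLogger().info('written to logs/app.log');
// After logrotate moves the file, `kill -HUP <pid>` makes the appender
// close and reopen logs/app.log (see the SIGHUP handler above).
```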
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/configuration-test.js
const { test } = require('tap'); const sandbox = require('@log4js-node/sandboxed-module'); const realFS = require('fs'); const modulePath = 'some/path/to/mylog4js.json'; const pathsChecked = []; let fakeFS = {}; let dependencies; let fileRead; test('log4js configure', (batch) => { batch.beforeEach((done) => { fileRead = 0; fakeFS = { realpath: realFS.realpath, // fs-extra looks for this ReadStream: realFS.ReadStream, // need to define these, because graceful-fs uses them WriteStream: realFS.WriteStream, read: realFS.read, closeSync: () => {}, config: { appenders: { console: { type: 'console', layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['console'], level: 'INFO', }, }, }, readdirSync: (dir) => require('fs').readdirSync(dir), readFileSync: (file, encoding) => { fileRead += 1; batch.type(file, 'string'); batch.equal(file, modulePath); batch.equal(encoding, 'utf8'); return JSON.stringify(fakeFS.config); }, statSync: (path) => { pathsChecked.push(path); if (path === modulePath) { return { mtime: new Date() }; } throw new Error('no such file'); }, }; dependencies = { requires: { fs: fakeFS, }, }; if (typeof done === 'function') { done(); } }); batch.test( 'when configuration file loaded via LOG4JS_CONFIG env variable', (t) => { process.env.LOG4JS_CONFIG = 'some/path/to/mylog4js.json'; const log4js = sandbox.require('../../lib/log4js', dependencies); log4js.getLogger('test-logger'); t.equal(fileRead, 1, 'should load the specified local config file'); delete process.env.LOG4JS_CONFIG; t.end(); } ); batch.test( 'when configuration is set via configure() method call, return the log4js object', (t) => { const log4js = sandbox .require('../../lib/log4js', dependencies) .configure(fakeFS.config); t.type( log4js, 'object', 'Configure method call should return the log4js object!' ); const log = log4js.getLogger('daemon'); t.type( log, 'object', 'log4js object, returned by configure(...) method should be able to create log object.' ); t.type(log.info, 'function'); t.end(); } ); batch.end(); });
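What the first test above exercises in practice is pointing log4js at a JSON config file via the environment. An end-to-end sketch, with invented paths and contents:

```javascript
// mylog4js.json -- hypothetical contents
// {
//   "appenders": { "out": { "type": "stdout" } },
//   "categories": { "default": { "appenders": ["out"], "level": "info" } }
// }

// Shell: LOG4JS_CONFIG=some/path/to/mylog4js.json node app.js

// app.js
const log4js = require('log4js');
// no explicit configure() call: the file named by LOG4JS_CONFIG is read
// the first time a logger is requested (as the test above asserts)
const logger = log4js.getLogger('app');
logger.info('configured from the file named by LOG4JS_CONFIG');
```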
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./examples/fromreadme.js
// remember to change the require to just 'log4js' if you've npm install'ed it const log4js = require('../lib/log4js'); log4js.configure({ appenders: { cheese: { type: 'file', filename: 'cheese.log' } }, categories: { default: { appenders: ['cheese'], level: 'error' } }, }); const logger = log4js.getLogger('cheese'); logger.level = 'ERROR'; logger.trace('Entering cheese testing'); logger.debug('Got cheese.'); logger.info('Cheese is Gouda.'); logger.warn('Cheese is quite smelly.'); logger.error('Cheese is too ripe!'); logger.fatal('Cheese was breeding ground for listeria.');
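One practical addition when running a file-appender example like this as a short-lived script is to flush buffered writes before the process exits. `log4js.shutdown(cb)` is the call used for that elsewhere in this repository's tests; the snippet below is a suggested continuation of the example, not part of the original file.

```javascript
// flush the 'cheese.log' file appender before exiting
log4js.shutdown((err) => {
  if (err) {
    console.error('log4js shutdown failed', err);
  }
  process.exit(err ? 1 : 0);
});
```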
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./lib/appenders/fileSync.js
const debug = require('debug')('log4js:fileSync'); const path = require('path'); const fs = require('fs'); const os = require('os'); const eol = os.EOL; function touchFile(file, options) { // attempt to create the directory const mkdir = (dir) => { try { return fs.mkdirSync(dir, { recursive: true }); } catch (e) { // backward-compatible fs.mkdirSync for nodejs pre-10.12.0 (without recursive option) // recursive creation of parent first if (e.code === 'ENOENT') { mkdir(path.dirname(dir)); return mkdir(dir); } // throw error for all except EEXIST and EROFS (read-only filesystem) if (e.code !== 'EEXIST' && e.code !== 'EROFS') { throw e; } // EEXIST: throw if file and not directory // EROFS : throw if directory not found else { try { if (fs.statSync(dir).isDirectory()) { return dir; } throw e; } catch (err) { throw e; } } } }; mkdir(path.dirname(file)); // try to throw EISDIR, EROFS, EACCES fs.appendFileSync(file, '', { mode: options.mode, flag: options.flags }); } class RollingFileSync { constructor(filename, maxLogSize, backups, options) { debug('In RollingFileStream'); if (maxLogSize < 0) { throw new Error(`maxLogSize (${maxLogSize}) should be > 0`); } this.filename = filename; this.size = maxLogSize; this.backups = backups; this.options = options; this.currentSize = 0; function currentFileSize(file) { let fileSize = 0; try { fileSize = fs.statSync(file).size; } catch (e) { // file does not exist touchFile(file, options); } return fileSize; } this.currentSize = currentFileSize(this.filename); } shouldRoll() { debug( 'should roll with current size %d, and max size %d', this.currentSize, this.size ); return this.currentSize >= this.size; } roll(filename) { const that = this; const nameMatcher = new RegExp(`^${path.basename(filename)}`); function justTheseFiles(item) { return nameMatcher.test(item); } function index(filename_) { return ( parseInt(filename_.slice(`${path.basename(filename)}.`.length), 10) || 0 ); } function byIndex(a, b) { return index(a) - index(b); } function increaseFileIndex(fileToRename) { const idx = index(fileToRename); debug(`Index of ${fileToRename} is ${idx}`); if (that.backups === 0) { fs.truncateSync(filename, 0); } else if (idx < that.backups) { // on windows, you can get a EEXIST error if you rename a file to an existing file // so, we'll try to delete the file we're renaming to first try { fs.unlinkSync(`${filename}.${idx + 1}`); } catch (e) { // ignore err: if we could not delete, it's most likely that it doesn't exist } debug(`Renaming ${fileToRename} -> ${filename}.${idx + 1}`); fs.renameSync( path.join(path.dirname(filename), fileToRename), `${filename}.${idx + 1}` ); } } function renameTheFiles() { // roll the backups (rename file.n to file.n+1, where n <= numBackups) debug('Renaming the old files'); const files = fs.readdirSync(path.dirname(filename)); files .filter(justTheseFiles) .sort(byIndex) .reverse() .forEach(increaseFileIndex); } debug('Rolling, rolling, rolling'); renameTheFiles(); } // eslint-disable-next-line no-unused-vars write(chunk, encoding) { const that = this; function writeTheChunk() { debug('writing the chunk to the file'); that.currentSize += chunk.length; fs.appendFileSync(that.filename, chunk); } debug('in write'); if (this.shouldRoll()) { this.currentSize = 0; this.roll(this.filename); } writeTheChunk(); } } /** * File Appender writing the logs to a text file. Supports rolling of logs by size. 
* * @param file the file log messages will be written to * @param layout a function that takes a logevent and returns a string * (defaults to basicLayout). * @param logSize - the maximum size (in bytes) for a log file, * if not provided then logs won't be rotated. * @param numBackups - the number of log files to keep after logSize * has been reached (default 5) * @param options - options to be passed to the underlying stream * @param timezoneOffset - optional timezone offset in minutes (default system local) */ function fileAppender( file, layout, logSize, numBackups, options, timezoneOffset ) { if (typeof file !== 'string' || file.length === 0) { throw new Error(`Invalid filename: ${file}`); } else if (file.endsWith(path.sep)) { throw new Error(`Filename is a directory: ${file}`); } else { // handle ~ expansion: https://github.com/nodejs/node/issues/684 // exclude ~ and ~filename as these can be valid files file = file.replace(new RegExp(`^~(?=${path.sep}.+)`), os.homedir()); } file = path.normalize(file); numBackups = !numBackups && numBackups !== 0 ? 5 : numBackups; debug( 'Creating fileSync appender (', file, ', ', logSize, ', ', numBackups, ', ', options, ', ', timezoneOffset, ')' ); function openTheStream(filePath, fileSize, numFiles) { let stream; if (fileSize) { stream = new RollingFileSync(filePath, fileSize, numFiles, options); } else { stream = ((f) => { // touch the file to apply flags (like w to truncate the file) touchFile(f, options); return { write(data) { fs.appendFileSync(f, data); }, }; })(filePath); } return stream; } const logFile = openTheStream(file, logSize, numBackups); return (loggingEvent) => { logFile.write(layout(loggingEvent, timezoneOffset) + eol); }; } function configure(config, layouts) { let layout = layouts.basicLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } const options = { flags: config.flags || 'a', encoding: config.encoding || 'utf8', mode: config.mode || 0o600, }; return fileAppender( config.filename, layout, config.maxLogSize, config.backups, options, config.timezoneOffset ); } module.exports.configure = configure;
const debug = require('debug')('log4js:fileSync'); const path = require('path'); const fs = require('fs'); const os = require('os'); const eol = os.EOL; function touchFile(file, options) { // attempt to create the directory const mkdir = (dir) => { try { return fs.mkdirSync(dir, { recursive: true }); } catch (e) { // backward-compatible fs.mkdirSync for nodejs pre-10.12.0 (without recursive option) // recursive creation of parent first if (e.code === 'ENOENT') { mkdir(path.dirname(dir)); return mkdir(dir); } // throw error for all except EEXIST and EROFS (read-only filesystem) if (e.code !== 'EEXIST' && e.code !== 'EROFS') { throw e; } // EEXIST: throw if file and not directory // EROFS : throw if directory not found else { try { if (fs.statSync(dir).isDirectory()) { return dir; } throw e; } catch (err) { throw e; } } } }; mkdir(path.dirname(file)); // try to throw EISDIR, EROFS, EACCES fs.appendFileSync(file, '', { mode: options.mode, flag: options.flags }); } class RollingFileSync { constructor(filename, maxLogSize, backups, options) { debug('In RollingFileStream'); if (maxLogSize < 0) { throw new Error(`maxLogSize (${maxLogSize}) should be > 0`); } this.filename = filename; this.size = maxLogSize; this.backups = backups; this.options = options; this.currentSize = 0; function currentFileSize(file) { let fileSize = 0; try { fileSize = fs.statSync(file).size; } catch (e) { // file does not exist touchFile(file, options); } return fileSize; } this.currentSize = currentFileSize(this.filename); } shouldRoll() { debug( 'should roll with current size %d, and max size %d', this.currentSize, this.size ); return this.currentSize >= this.size; } roll(filename) { const that = this; const nameMatcher = new RegExp(`^${path.basename(filename)}`); function justTheseFiles(item) { return nameMatcher.test(item); } function index(filename_) { return ( parseInt(filename_.slice(`${path.basename(filename)}.`.length), 10) || 0 ); } function byIndex(a, b) { return index(a) - index(b); } function increaseFileIndex(fileToRename) { const idx = index(fileToRename); debug(`Index of ${fileToRename} is ${idx}`); if (that.backups === 0) { fs.truncateSync(filename, 0); } else if (idx < that.backups) { // on windows, you can get a EEXIST error if you rename a file to an existing file // so, we'll try to delete the file we're renaming to first try { fs.unlinkSync(`${filename}.${idx + 1}`); } catch (e) { // ignore err: if we could not delete, it's most likely that it doesn't exist } debug(`Renaming ${fileToRename} -> ${filename}.${idx + 1}`); fs.renameSync( path.join(path.dirname(filename), fileToRename), `${filename}.${idx + 1}` ); } } function renameTheFiles() { // roll the backups (rename file.n to file.n+1, where n <= numBackups) debug('Renaming the old files'); const files = fs.readdirSync(path.dirname(filename)); files .filter(justTheseFiles) .sort(byIndex) .reverse() .forEach(increaseFileIndex); } debug('Rolling, rolling, rolling'); renameTheFiles(); } // eslint-disable-next-line no-unused-vars write(chunk, encoding) { const that = this; function writeTheChunk() { debug('writing the chunk to the file'); that.currentSize += chunk.length; fs.appendFileSync(that.filename, chunk); } debug('in write'); if (this.shouldRoll()) { this.currentSize = 0; this.roll(this.filename); } writeTheChunk(); } } /** * File Appender writing the logs to a text file. Supports rolling of logs by size. 
* * @param file the file log messages will be written to * @param layout a function that takes a logevent and returns a string * (defaults to basicLayout). * @param logSize - the maximum size (in bytes) for a log file, * if not provided then logs won't be rotated. * @param numBackups - the number of log files to keep after logSize * has been reached (default 5) * @param options - options to be passed to the underlying stream * @param timezoneOffset - optional timezone offset in minutes (default system local) */ function fileAppender( file, layout, logSize, numBackups, options, timezoneOffset ) { if (typeof file !== 'string' || file.length === 0) { throw new Error(`Invalid filename: ${file}`); } else if (file.endsWith(path.sep)) { throw new Error(`Filename is a directory: ${file}`); } else { // handle ~ expansion: https://github.com/nodejs/node/issues/684 // exclude ~ and ~filename as these can be valid files file = file.replace(new RegExp(`^~(?=${path.sep}.+)`), os.homedir()); } file = path.normalize(file); numBackups = !numBackups && numBackups !== 0 ? 5 : numBackups; debug( 'Creating fileSync appender (', file, ', ', logSize, ', ', numBackups, ', ', options, ', ', timezoneOffset, ')' ); function openTheStream(filePath, fileSize, numFiles) { let stream; if (fileSize) { stream = new RollingFileSync(filePath, fileSize, numFiles, options); } else { stream = ((f) => { // touch the file to apply flags (like w to truncate the file) touchFile(f, options); return { write(data) { fs.appendFileSync(f, data); }, }; })(filePath); } return stream; } const logFile = openTheStream(file, logSize, numBackups); return (loggingEvent) => { logFile.write(layout(loggingEvent, timezoneOffset) + eol); }; } function configure(config, layouts) { let layout = layouts.basicLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } const options = { flags: config.flags || 'a', encoding: config.encoding || 'utf8', mode: config.mode || 0o600, }; return fileAppender( config.filename, layout, config.maxLogSize, config.backups, options, config.timezoneOffset ); } module.exports.configure = configure;
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/configuration-validation-test.js
const { test } = require('tap'); const util = require('util'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const debug = require('debug')('log4js:test.configuration-validation'); const deepFreeze = require('deep-freeze'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const configuration = require('../../lib/configuration'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; const testAppender = (label, result) => ({ configure(config, layouts, findAppender) { debug( `testAppender(${label}).configure called, with config: ${util.inspect( config )}` ); result.configureCalled = true; result.type = config.type; result.label = label; result.config = config; result.layouts = layouts; result.findAppender = findAppender; return {}; }, }); test('log4js configuration validation', (batch) => { batch.test('should give error if config is just plain silly', (t) => { [null, undefined, '', ' ', []].forEach((config) => { const expectedError = new Error( `Problem with log4js configuration: (${util.inspect( config )}) - must be an object.` ); t.throws(() => configuration.configure(config), expectedError); }); t.end(); }); batch.test('should give error if config is an empty object', (t) => { t.throws( () => log4js.configure({}), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no appenders', (t) => { t.throws( () => log4js.configure({ categories: {} }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no categories', (t) => { t.throws( () => log4js.configure({ appenders: { out: { type: 'stdout' } } }), '- must have a property "categories" of type object.' ); t.end(); }); batch.test('should give error if appenders is not an object', (t) => { t.throws( () => log4js.configure({ appenders: [], categories: [] }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if appenders are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { thing: 'cheese' }, categories: {} }), '- appender "thing" is not valid (must be an object with property "type")' ); t.end(); }); batch.test('should require at least one appender', (t) => { t.throws( () => log4js.configure({ appenders: {}, categories: {} }), '- must define at least one appender.' ); t.end(); }); batch.test('should give error if categories are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: 'cheese' }, }), '- category "thing" is not valid (must be an object with properties "appenders" and "level")' ); t.end(); }); batch.test('should give error if default category not defined', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['stdout'], level: 'ERROR' } }, }), '- must define a "default" category.' ); t.end(); }); batch.test('should require at least one category', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: {}, }), '- must define at least one category.' 
); t.end(); }); batch.test('should give error if category.appenders is not an array', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: {}, level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must be an array of appender names)' ); t.end(); }); batch.test('should give error if category.appenders is empty', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: [], level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must contain at least one appender name)' ); t.end(); }); batch.test( 'should give error if categories do not refer to valid appenders', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['cheese'], level: 'ERROR' } }, }), '- category "thing" is not valid (appender "cheese" is not defined)' ); t.end(); } ); batch.test('should give error if category level is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Biscuits' } }, }), '- category "default" is not valid (level "Biscuits" not recognised; valid levels are ALL, TRACE' ); t.end(); }); batch.test( 'should give error if category enableCallStack is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Debug', enableCallStack: '123', }, }, }), '- category "default" is not valid (enableCallStack must be boolean type)' ); t.end(); } ); batch.test('should give error if appender type cannot be found', (t) => { t.throws( () => log4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }), '- appender "thing" is not valid (type "cheese" could not be found)' ); t.end(); }); batch.test('should create appender instances', (t) => { const thing = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cheese: testAppender('cheesy', thing), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.equal(thing.type, 'cheese'); t.end(); }); batch.test( 'should use provided appender instance if instance provided', (t) => { const thing = {}; const cheese = testAppender('cheesy', thing); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: cheese } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.same(thing.type, cheese); t.end(); } ); batch.test('should not throw error if configure object is freezed', (t) => { const testFile = 'test/tap/freeze-date-file-test'; t.teardown(async () => { await removeFiles(testFile); }); t.doesNotThrow(() => log4js.configure( deepFreeze({ appenders: { dateFile: { type: 'dateFile', filename: testFile, alwaysIncludePattern: false, }, }, categories: { default: { appenders: ['dateFile'], level: log4js.levels.ERROR }, }, }) ) ); log4js.shutdown(() => { t.end(); }); }); batch.test('should load appenders from core first', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { './cheese': testAppender('correct', result), cheese: testAppender('wrong', result), }, 
ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); }); batch.test( 'should load appenders relative to main file if not in core, or node_modules', (t) => { const result = {}; const mainPath = path.dirname(require.main.filename); const sandboxConfig = { ignoreMissing: true, requires: {}, }; sandboxConfig.requires[`${mainPath}/cheese`] = testAppender( 'correct', result ); // add this one, because when we're running coverage the main path is a bit different sandboxConfig.requires[ `${path.join(mainPath, '../../node_modules/nyc/bin/cheese')}` ] = testAppender('correct', result); // in tap v15, the main path is at root of log4js (run `DEBUG=log4js:appenders npm test > /dev/null` to check) sandboxConfig.requires[`${path.join(mainPath, '../../cheese')}`] = testAppender('correct', result); // in node v6, there's an extra layer of node modules for some reason, so add this one to work around it sandboxConfig.requires[ `${path.join( mainPath, '../../node_modules/tap/node_modules/nyc/bin/cheese' )}` ] = testAppender('correct', result); const sandboxedLog4js = sandbox.require( '../../lib/log4js', sandboxConfig ); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test( 'should load appenders relative to process.cwd if not found in core, node_modules', (t) => { const result = {}; const fakeProcess = new Proxy(process, { get(target, key) { if (key === 'cwd') { return () => '/var/lib/cheese'; } return target[key]; }, }); // windows file paths are different to unix, so let's make this work for both. 
const requires = {}; requires[path.join('/var', 'lib', 'cheese', 'cheese')] = testAppender( 'correct', result ); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires, globals: { process: fakeProcess, }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test('should pass config, layout, findAppender to appenders', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires: { cheese: testAppender('cheesy', result), notCheese: testAppender('notCheesy', {}), }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese', foo: 'bar' }, thing2: { type: 'notCheese' }, }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.config.foo, 'bar'); t.type(result.layouts, 'object'); t.type(result.layouts.basicLayout, 'function'); t.type(result.findAppender, 'function'); t.type(result.findAppender('thing2'), 'object'); t.end(); }); batch.test( 'should not give error if level object is used instead of string', (t) => { t.doesNotThrow(() => log4js.configure({ appenders: { thing: { type: 'stdout' } }, categories: { default: { appenders: ['thing'], level: log4js.levels.ERROR }, }, }) ); t.end(); } ); batch.test( 'should not create appender instance if not used in categories', (t) => { const used = {}; const notUsed = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cat: testAppender('meow', used), dog: testAppender('woof', notUsed), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { used: { type: 'cat' }, notUsed: { type: 'dog' } }, categories: { default: { appenders: ['used'], level: 'ERROR' } }, }); t.ok(used.configureCalled); t.notOk(notUsed.configureCalled); t.end(); } ); batch.end(); });
const { test } = require('tap'); const util = require('util'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const debug = require('debug')('log4js:test.configuration-validation'); const deepFreeze = require('deep-freeze'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const configuration = require('../../lib/configuration'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; const testAppender = (label, result) => ({ configure(config, layouts, findAppender) { debug( `testAppender(${label}).configure called, with config: ${util.inspect( config )}` ); result.configureCalled = true; result.type = config.type; result.label = label; result.config = config; result.layouts = layouts; result.findAppender = findAppender; return {}; }, }); test('log4js configuration validation', (batch) => { batch.test('should give error if config is just plain silly', (t) => { [null, undefined, '', ' ', []].forEach((config) => { const expectedError = new Error( `Problem with log4js configuration: (${util.inspect( config )}) - must be an object.` ); t.throws(() => configuration.configure(config), expectedError); }); t.end(); }); batch.test('should give error if config is an empty object', (t) => { t.throws( () => log4js.configure({}), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no appenders', (t) => { t.throws( () => log4js.configure({ categories: {} }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if config has no categories', (t) => { t.throws( () => log4js.configure({ appenders: { out: { type: 'stdout' } } }), '- must have a property "categories" of type object.' ); t.end(); }); batch.test('should give error if appenders is not an object', (t) => { t.throws( () => log4js.configure({ appenders: [], categories: [] }), '- must have a property "appenders" of type object.' ); t.end(); }); batch.test('should give error if appenders are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { thing: 'cheese' }, categories: {} }), '- appender "thing" is not valid (must be an object with property "type")' ); t.end(); }); batch.test('should require at least one appender', (t) => { t.throws( () => log4js.configure({ appenders: {}, categories: {} }), '- must define at least one appender.' ); t.end(); }); batch.test('should give error if categories are not all valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: 'cheese' }, }), '- category "thing" is not valid (must be an object with properties "appenders" and "level")' ); t.end(); }); batch.test('should give error if default category not defined', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['stdout'], level: 'ERROR' } }, }), '- must define a "default" category.' ); t.end(); }); batch.test('should require at least one category', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: {}, }), '- must define at least one category.' 
); t.end(); }); batch.test('should give error if category.appenders is not an array', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: {}, level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must be an array of appender names)' ); t.end(); }); batch.test('should give error if category.appenders is empty', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: [], level: 'ERROR' } }, }), '- category "thing" is not valid (appenders must contain at least one appender name)' ); t.end(); }); batch.test( 'should give error if categories do not refer to valid appenders', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { thing: { appenders: ['cheese'], level: 'ERROR' } }, }), '- category "thing" is not valid (appender "cheese" is not defined)' ); t.end(); } ); batch.test('should give error if category level is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Biscuits' } }, }), '- category "default" is not valid (level "Biscuits" not recognised; valid levels are ALL, TRACE' ); t.end(); }); batch.test( 'should give error if category enableCallStack is not valid', (t) => { t.throws( () => log4js.configure({ appenders: { stdout: { type: 'stdout' } }, categories: { default: { appenders: ['stdout'], level: 'Debug', enableCallStack: '123', }, }, }), '- category "default" is not valid (enableCallStack must be boolean type)' ); t.end(); } ); batch.test('should give error if appender type cannot be found', (t) => { t.throws( () => log4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }), '- appender "thing" is not valid (type "cheese" could not be found)' ); t.end(); }); batch.test('should create appender instances', (t) => { const thing = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cheese: testAppender('cheesy', thing), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.equal(thing.type, 'cheese'); t.end(); }); batch.test( 'should use provided appender instance if instance provided', (t) => { const thing = {}; const cheese = testAppender('cheesy', thing); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: cheese } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(thing.configureCalled); t.same(thing.type, cheese); t.end(); } ); batch.test('should not throw error if configure object is freezed', (t) => { const testFile = 'test/tap/freeze-date-file-test'; t.teardown(async () => { await removeFiles(testFile); }); t.doesNotThrow(() => log4js.configure( deepFreeze({ appenders: { dateFile: { type: 'dateFile', filename: testFile, alwaysIncludePattern: false, }, }, categories: { default: { appenders: ['dateFile'], level: log4js.levels.ERROR }, }, }) ) ); log4js.shutdown(() => { t.end(); }); }); batch.test('should load appenders from core first', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { './cheese': testAppender('correct', result), cheese: testAppender('wrong', result), }, 
ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); }); batch.test( 'should load appenders relative to main file if not in core, or node_modules', (t) => { const result = {}; const mainPath = path.dirname(require.main.filename); const sandboxConfig = { ignoreMissing: true, requires: {}, }; sandboxConfig.requires[`${mainPath}/cheese`] = testAppender( 'correct', result ); // add this one, because when we're running coverage the main path is a bit different sandboxConfig.requires[ `${path.join(mainPath, '../../node_modules/nyc/bin/cheese')}` ] = testAppender('correct', result); // in tap v15, the main path is at root of log4js (run `DEBUG=log4js:appenders npm test > /dev/null` to check) sandboxConfig.requires[`${path.join(mainPath, '../../cheese')}`] = testAppender('correct', result); // in node v6, there's an extra layer of node modules for some reason, so add this one to work around it sandboxConfig.requires[ `${path.join( mainPath, '../../node_modules/tap/node_modules/nyc/bin/cheese' )}` ] = testAppender('correct', result); const sandboxedLog4js = sandbox.require( '../../lib/log4js', sandboxConfig ); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test( 'should load appenders relative to process.cwd if not found in core, node_modules', (t) => { const result = {}; const fakeProcess = new Proxy(process, { get(target, key) { if (key === 'cwd') { return () => '/var/lib/cheese'; } return target[key]; }, }); // windows file paths are different to unix, so let's make this work for both. 
const requires = {}; requires[path.join('/var', 'lib', 'cheese', 'cheese')] = testAppender( 'correct', result ); const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires, globals: { process: fakeProcess, }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese' } }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.label, 'correct'); t.end(); } ); batch.test('should pass config, layout, findAppender to appenders', (t) => { const result = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { ignoreMissing: true, requires: { cheese: testAppender('cheesy', result), notCheese: testAppender('notCheesy', {}), }, }); sandboxedLog4js.configure({ appenders: { thing: { type: 'cheese', foo: 'bar' }, thing2: { type: 'notCheese' }, }, categories: { default: { appenders: ['thing'], level: 'ERROR' } }, }); t.ok(result.configureCalled); t.equal(result.type, 'cheese'); t.equal(result.config.foo, 'bar'); t.type(result.layouts, 'object'); t.type(result.layouts.basicLayout, 'function'); t.type(result.findAppender, 'function'); t.type(result.findAppender('thing2'), 'object'); t.end(); }); batch.test( 'should not give error if level object is used instead of string', (t) => { t.doesNotThrow(() => log4js.configure({ appenders: { thing: { type: 'stdout' } }, categories: { default: { appenders: ['thing'], level: log4js.levels.ERROR }, }, }) ); t.end(); } ); batch.test( 'should not create appender instance if not used in categories', (t) => { const used = {}; const notUsed = {}; const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { cat: testAppender('meow', used), dog: testAppender('woof', notUsed), }, ignoreMissing: true, }); sandboxedLog4js.configure({ appenders: { used: { type: 'cat' }, notUsed: { type: 'dog' } }, categories: { default: { appenders: ['used'], level: 'ERROR' } }, }); t.ok(used.configureCalled); t.notOk(notUsed.configureCalled); t.end(); } ); batch.end(); });
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/no-cluster-test.js
const { test } = require('tap'); const proxyquire = require('proxyquire'); test('clustering is disabled if cluster is not present', (t) => { const log4js = proxyquire('../../lib/log4js', { cluster: null }); const recorder = require('../../lib/appenders/recording'); log4js.configure({ appenders: { vcr: { type: 'recording' } }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, }); log4js.getLogger().info('it should still work'); const events = recorder.replay(); t.equal(events[0].data[0], 'it should still work'); t.end(); });
const { test } = require('tap'); const proxyquire = require('proxyquire'); test('clustering is disabled if cluster is not present', (t) => { const log4js = proxyquire('../../lib/log4js', { cluster: null }); const recorder = require('../../lib/appenders/recording'); log4js.configure({ appenders: { vcr: { type: 'recording' } }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, }); log4js.getLogger().info('it should still work'); const events = recorder.replay(); t.equal(events[0].data[0], 'it should still work'); t.end(); });
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./examples/patternLayout-tokens.js
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', layout: { type: 'pattern', pattern: '%[%r (%x{pid}) %p %c -%] %m%n', tokens: { pid: function () { return process.pid; }, }, }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, }); const logger = log4js.getLogger('app'); logger.info('Test log message');
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', layout: { type: 'pattern', pattern: '%[%r (%x{pid}) %p %c -%] %m%n', tokens: { pid: function () { return process.pid; }, }, }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, }); const logger = log4js.getLogger('app'); logger.info('Test log message');
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/noLogFilter-test.js
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); /** * test a simple regexp */ test('log4js noLogFilter', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test( 'appender should exclude events that match the regexp string', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: 'This.*not', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.end(); } ); /** * test an array of regexp */ batch.test( 'appender should exclude events that match the regexp string contained in the array', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['This.*not', 'instead'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); logger.debug('This case instead it should get logged'); logger.debug('The last that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 3); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.equal(logEvents[2].data[0], 'The last that should get logged'); t.end(); } ); /** * test case insentitive regexp */ batch.test( 'appender should evaluate the regexp using incase sentitive option', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', 'eX.*de'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug('Exclude this string'); logger.debug('Include this string'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Include this string'); t.end(); } ); /** * test empty string or null regexp */ batch.test( 'appender should skip the match in case of empty or null regexp', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['', null, undefined], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('Another string that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Another string that should get logged'); t.end(); } ); /** * test for excluding all the events that contains digits */ 
batch.test('appender should exclude the events that contains digits', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: '\\d', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('The 2nd event should not get logged'); logger.debug('The 3rd event should not get logged, such as the 2nd'); const logEvents = recording.replay(); t.equal(logEvents.length, 1); t.equal(logEvents[0].data[0], 'This should get logged'); t.end(); }); /** * test the cases provided in the documentation * https://log4js-node.github.io/log4js-node/noLogFilter.html */ batch.test( 'appender should exclude not valid events according to the documentation', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', '\\d', ''], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('I will be logged in all-the-logs.log'); logger.debug('I will be not logged in all-the-logs.log'); logger.debug('A 2nd message that will be excluded in all-the-logs.log'); logger.debug('Hello again'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'I will be logged in all-the-logs.log'); t.equal(logEvents[1].data[0], 'Hello again'); t.end(); } ); batch.end(); });
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); /** * test a simple regexp */ test('log4js noLogFilter', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test( 'appender should exclude events that match the regexp string', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: 'This.*not', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.end(); } ); /** * test an array of regexp */ batch.test( 'appender should exclude events that match the regexp string contained in the array', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['This.*not', 'instead'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); logger.debug('This case instead it should get logged'); logger.debug('The last that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 3); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.equal(logEvents[2].data[0], 'The last that should get logged'); t.end(); } ); /** * test case insentitive regexp */ batch.test( 'appender should evaluate the regexp using incase sentitive option', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', 'eX.*de'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug('Exclude this string'); logger.debug('Include this string'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Include this string'); t.end(); } ); /** * test empty string or null regexp */ batch.test( 'appender should skip the match in case of empty or null regexp', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['', null, undefined], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('Another string that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Another string that should get logged'); t.end(); } ); /** * test for excluding all the events that contains digits */ 
batch.test('appender should exclude the events that contains digits', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: '\\d', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('The 2nd event should not get logged'); logger.debug('The 3rd event should not get logged, such as the 2nd'); const logEvents = recording.replay(); t.equal(logEvents.length, 1); t.equal(logEvents[0].data[0], 'This should get logged'); t.end(); }); /** * test the cases provided in the documentation * https://log4js-node.github.io/log4js-node/noLogFilter.html */ batch.test( 'appender should exclude not valid events according to the documentation', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', '\\d', ''], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('I will be logged in all-the-logs.log'); logger.debug('I will be not logged in all-the-logs.log'); logger.debug('A 2nd message that will be excluded in all-the-logs.log'); logger.debug('Hello again'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'I will be logged in all-the-logs.log'); t.equal(logEvents[1].data[0], 'Hello again'); t.end(); } ); batch.end(); });
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/disable-cluster-test.js
const { test } = require('tap'); const cluster = require('cluster'); const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); cluster.removeAllListeners(); log4js.configure({ appenders: { vcr: { type: 'recording' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, disableClustering: true, }); if (cluster.isMaster) { cluster.fork(); const masterLogger = log4js.getLogger('master'); const masterPid = process.pid; masterLogger.info('this is master'); cluster.on('exit', () => { const logEvents = recorder.replay(); test('cluster master', (batch) => { batch.test('only master events should be logged', (t) => { t.equal(logEvents.length, 1); t.equal(logEvents[0].categoryName, 'master'); t.equal(logEvents[0].pid, masterPid); t.equal(logEvents[0].data[0], 'this is master'); t.end(); }); batch.end(); }); }); } else { const workerLogger = log4js.getLogger('worker'); workerLogger.info('this is worker', new Error('oh dear')); const workerEvents = recorder.replay(); test('cluster worker', (batch) => { batch.test('should send events to its own appender', (t) => { t.equal(workerEvents.length, 1); t.equal(workerEvents[0].categoryName, 'worker'); t.equal(workerEvents[0].data[0], 'this is worker'); t.type(workerEvents[0].data[1], 'Error'); t.match(workerEvents[0].data[1].stack, 'Error: oh dear'); t.end(); }); batch.end(); }); // test sending a cluster-style log message process.send({ topic: 'log4js:message', data: { cheese: 'gouda' } }); cluster.worker.disconnect(); }
const { test } = require('tap'); const cluster = require('cluster'); const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); cluster.removeAllListeners(); log4js.configure({ appenders: { vcr: { type: 'recording' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, disableClustering: true, }); if (cluster.isMaster) { cluster.fork(); const masterLogger = log4js.getLogger('master'); const masterPid = process.pid; masterLogger.info('this is master'); cluster.on('exit', () => { const logEvents = recorder.replay(); test('cluster master', (batch) => { batch.test('only master events should be logged', (t) => { t.equal(logEvents.length, 1); t.equal(logEvents[0].categoryName, 'master'); t.equal(logEvents[0].pid, masterPid); t.equal(logEvents[0].data[0], 'this is master'); t.end(); }); batch.end(); }); }); } else { const workerLogger = log4js.getLogger('worker'); workerLogger.info('this is worker', new Error('oh dear')); const workerEvents = recorder.replay(); test('cluster worker', (batch) => { batch.test('should send events to its own appender', (t) => { t.equal(workerEvents.length, 1); t.equal(workerEvents[0].categoryName, 'worker'); t.equal(workerEvents[0].data[0], 'this is worker'); t.type(workerEvents[0].data[1], 'Error'); t.match(workerEvents[0].data[1].stack, 'Error: oh dear'); t.end(); }); batch.end(); }); // test sending a cluster-style log message process.send({ topic: 'log4js:message', data: { cheese: 'gouda' } }); cluster.worker.disconnect(); }
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./docs/writing-appenders.md
# Writing Appenders for Log4js Log4js can load appenders from outside its core set. To add a custom appender, the easiest way is to make it a stand-alone module and publish to npm. You can also load appenders from your own application, but they must be defined in a module. ## Loading mechanism When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using variations of the `type` value. Log4js checks the following places (in this order) for appenders based on the type value: 1. Bundled core appenders (within appenders directory): `require('./' + type)` 2. node_modules: `require(type)` 3. relative to the main file of your application: `require(path.dirname(require.main.filename) + '/' + type)` 4. relative to the process' current working directory: `require(process.cwd() + '/' + type)` If that fails, an error will be raised. ## Appender Modules An appender module should export a single function called `configure`. The function should accept the following arguments: - `config` - `object` - the appender's configuration object - `layouts` - `module` - gives access to the [layouts](layouts.md) module, which most appenders will need - `layout` - `function(type, config)` - this is the main function that appenders will use to find a layout - `findAppender` - `function(name)` - if your appender is a wrapper around another appender (like the [logLevelFilter](logLevelFilter.md) for example), this function can be used to find another appender by name - `levels` - `module` - gives access to the [levels](levels.md) module, which most appenders will need `configure` should return a function which accepts a logEvent, which is the appender itself. One of the simplest examples is the [stdout](stdout.md) appender. Let's run through the code. ## Example ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } // stdout configure doesn't need to use findAppender, or levels function configure(config, layouts) { // the default layout for the appender let layout = layouts.colouredLayout; // check if there is another layout specified if (config.layout) { // load the layout layout = layouts.layout(config.layout.type, config.layout); } //create a new appender instance return stdoutAppender(layout, config.timezoneOffset); } //export the only function needed exports.configure = configure; ``` # Shutdown functions It's a good idea to implement a `shutdown` function on your appender instances. This function will get called by `log4js.shutdown` and signals that `log4js` has been asked to stop logging. Usually this is because of a fatal exception, or the application is being stopped. Your shutdown function should make sure that all asynchronous operations finish, and that any resources are cleaned up. The function must be named `shutdown`, take one callback argument, and be a property of the appender instance. Let's add a shutdown function to the `stdout` appender as an example. 
## Example (shutdown) ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself const appender = (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; // add a shutdown function. appender.shutdown = (done) => { process.stdout.write("", done); }; return appender; } // ... rest of the code as above ```
# Writing Appenders for Log4js Log4js can load appenders from outside its core set. To add a custom appender, the easiest way is to make it a stand-alone module and publish to npm. You can also load appenders from your own application, but they must be defined in a module. ## Loading mechanism When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using variations of the `type` value. Log4js checks the following places (in this order) for appenders based on the type value: 1. Bundled core appenders (within appenders directory): `require('./' + type)` 2. node_modules: `require(type)` 3. relative to the main file of your application: `require(path.dirname(require.main.filename) + '/' + type)` 4. relative to the process' current working directory: `require(process.cwd() + '/' + type)` If that fails, an error will be raised. ## Appender Modules An appender module should export a single function called `configure`. The function should accept the following arguments: - `config` - `object` - the appender's configuration object - `layouts` - `module` - gives access to the [layouts](layouts.md) module, which most appenders will need - `layout` - `function(type, config)` - this is the main function that appenders will use to find a layout - `findAppender` - `function(name)` - if your appender is a wrapper around another appender (like the [logLevelFilter](logLevelFilter.md) for example), this function can be used to find another appender by name - `levels` - `module` - gives access to the [levels](levels.md) module, which most appenders will need `configure` should return a function which accepts a logEvent, which is the appender itself. One of the simplest examples is the [stdout](stdout.md) appender. Let's run through the code. ## Example ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } // stdout configure doesn't need to use findAppender, or levels function configure(config, layouts) { // the default layout for the appender let layout = layouts.colouredLayout; // check if there is another layout specified if (config.layout) { // load the layout layout = layouts.layout(config.layout.type, config.layout); } //create a new appender instance return stdoutAppender(layout, config.timezoneOffset); } //export the only function needed exports.configure = configure; ``` # Shutdown functions It's a good idea to implement a `shutdown` function on your appender instances. This function will get called by `log4js.shutdown` and signals that `log4js` has been asked to stop logging. Usually this is because of a fatal exception, or the application is being stopped. Your shutdown function should make sure that all asynchronous operations finish, and that any resources are cleaned up. The function must be named `shutdown`, take one callback argument, and be a property of the appender instance. Let's add a shutdown function to the `stdout` appender as an example. 
## Example (shutdown) ```javascript // This is the function that generates an appender function function stdoutAppender(layout, timezoneOffset) { // This is the appender function itself const appender = (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; // add a shutdown function. appender.shutdown = (done) => { process.stdout.write("", done); }; return appender; } // ... rest of the code as above ```
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
./test/tap/setLevel-asymmetry-test.js
// This test shows an asymmetry between setLevel and isLevelEnabled // (in log4js-node@0.4.3 and earlier): // 1) setLevel("foo") works, but setLevel(log4js.levels.foo) silently // does not (sets the level to TRACE). // 2) isLevelEnabled("foo") works as does isLevelEnabled(log4js.levels.foo). // const { test } = require('tap'); const log4js = require('../../lib/log4js'); const logger = log4js.getLogger('test-setLevel-asymmetry'); // Define the array of levels as string to iterate over. const strLevels = ['Trace', 'Debug', 'Info', 'Warn', 'Error', 'Fatal']; const log4jsLevels = strLevels.map(log4js.levels.getLevel); test('log4js setLevel', (batch) => { strLevels.forEach((strLevel) => { batch.test(`is called with a ${strLevel} as string`, (t) => { const log4jsLevel = log4js.levels.getLevel(strLevel); t.test('should convert string to level correctly', (assert) => { logger.level = strLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.test('should also accept a Level', (assert) => { logger.level = log4jsLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.end(); }); }); batch.end(); });
// This test shows an asymmetry between setLevel and isLevelEnabled // (in log4js-node@0.4.3 and earlier): // 1) setLevel("foo") works, but setLevel(log4js.levels.foo) silently // does not (sets the level to TRACE). // 2) isLevelEnabled("foo") works as does isLevelEnabled(log4js.levels.foo). // const { test } = require('tap'); const log4js = require('../../lib/log4js'); const logger = log4js.getLogger('test-setLevel-asymmetry'); // Define the array of levels as string to iterate over. const strLevels = ['Trace', 'Debug', 'Info', 'Warn', 'Error', 'Fatal']; const log4jsLevels = strLevels.map(log4js.levels.getLevel); test('log4js setLevel', (batch) => { strLevels.forEach((strLevel) => { batch.test(`is called with a ${strLevel} as string`, (t) => { const log4jsLevel = log4js.levels.getLevel(strLevel); t.test('should convert string to level correctly', (assert) => { logger.level = strLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.test('should also accept a Level', (assert) => { logger.level = log4jsLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.end(); }); }); batch.end(); });
./test/tap/levels-test.js
const { test } = require('tap'); const levels = require('../../lib/levels'); function assertThat(assert, level) { function assertForEach(assertion, testFn, otherLevels) { otherLevels.forEach((other) => { assertion.call(assert, testFn.call(level, other)); }); } return { isLessThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isLessThanOrEqualTo, lvls); }, isNotLessThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isLessThanOrEqualTo, lvls); }, isGreaterThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isGreaterThanOrEqualTo, lvls); }, isNotGreaterThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isGreaterThanOrEqualTo, lvls); }, isEqualTo(lvls) { assertForEach(assert.ok, level.isEqualTo, lvls); }, isNotEqualTo(lvls) { assertForEach(assert.notOk, level.isEqualTo, lvls); }, }; } test('levels', (batch) => { batch.test('values', (t) => { t.test('should define some levels', (assert) => { assert.ok(levels.ALL); assert.ok(levels.TRACE); assert.ok(levels.DEBUG); assert.ok(levels.INFO); assert.ok(levels.WARN); assert.ok(levels.ERROR); assert.ok(levels.FATAL); assert.ok(levels.MARK); assert.ok(levels.OFF); assert.end(); }); t.test('ALL', (assert) => { const all = levels.ALL; assertThat(assert, all).isLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isNotGreaterThanOrEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isEqualTo([levels.getLevel('ALL')]); assertThat(assert, all).isNotEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('TRACE', (assert) => { const trace = levels.TRACE; assertThat(assert, trace).isLessThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isNotLessThanOrEqualTo([levels.ALL]); assertThat(assert, trace).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, trace).isNotGreaterThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isEqualTo([levels.getLevel('TRACE')]); assertThat(assert, trace).isNotEqualTo([ levels.ALL, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('DEBUG', (assert) => { const debug = levels.DEBUG; assertThat(assert, debug).isLessThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isNotGreaterThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isEqualTo([levels.getLevel('DEBUG')]); assertThat(assert, debug).isNotEqualTo([ levels.ALL, levels.TRACE, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('INFO', (assert) => { const info = levels.INFO; assertThat(assert, info).isLessThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isGreaterThanOrEqualTo([ levels.ALL, 
levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isNotGreaterThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isEqualTo([levels.getLevel('INFO')]); assertThat(assert, info).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('WARN', (assert) => { const warn = levels.WARN; assertThat(assert, warn).isLessThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isNotGreaterThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isEqualTo([levels.getLevel('WARN')]); assertThat(assert, warn).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('ERROR', (assert) => { const error = levels.ERROR; assertThat(assert, error).isLessThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isNotGreaterThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isEqualTo([levels.getLevel('ERROR')]); assertThat(assert, error).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('FATAL', (assert) => { const fatal = levels.FATAL; assertThat(assert, fatal).isLessThanOrEqualTo([levels.MARK, levels.OFF]); assertThat(assert, fatal).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isNotGreaterThanOrEqualTo([ levels.MARK, levels.OFF, ]); assertThat(assert, fatal).isEqualTo([levels.getLevel('FATAL')]); assertThat(assert, fatal).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('MARK', (assert) => { const mark = levels.MARK; assertThat(assert, mark).isLessThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.ERROR, ]); assertThat(assert, mark).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, ]); assertThat(assert, mark).isNotGreaterThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isEqualTo([levels.getLevel('MARK')]); assertThat(assert, mark).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('OFF', (assert) => { const off = levels.OFF; assertThat(assert, off).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, 
levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isEqualTo([levels.getLevel('OFF')]); assertThat(assert, off).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assert.end(); }); t.end(); }); batch.test('isGreaterThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isGreaterThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isNotGreaterThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isLessThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isNotLessThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isLessThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isEqualTo(['info', 'INFO', 'iNfO']); t.end(); }); batch.test('getLevel', (t) => { t.equal(levels.getLevel('debug'), levels.DEBUG); t.equal(levels.getLevel('DEBUG'), levels.DEBUG); t.equal(levels.getLevel('DeBuG'), levels.DEBUG); t.notOk(levels.getLevel('cheese')); t.equal(levels.getLevel('cheese', levels.DEBUG), levels.DEBUG); t.equal( levels.getLevel({ level: 10000, levelStr: 'DEBUG', colour: 'cyan' }), levels.DEBUG ); t.end(); }); batch.end(); });
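The assertions above exercise the comparison helpers on `Level` instances. As a quick orientation, here is a small sketch of the same helpers used through the public `log4js.levels` object; the specific values mirror what the test checks.

```javascript
const log4js = require("log4js");
const { levels } = log4js;

levels.ERROR.isGreaterThanOrEqualTo(levels.WARN); // true
levels.INFO.isLessThanOrEqualTo("ERROR"); // true  - strings are accepted
levels.DEBUG.isEqualTo("debug"); // true  - comparison is case-insensitive

// getLevel() resolves names to Level instances, with an optional fallback.
levels.getLevel("DeBuG") === levels.DEBUG; // true
levels.getLevel("cheese"); // falsy - not a known level
levels.getLevel("cheese", levels.DEBUG) === levels.DEBUG; // true
```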
./test/tap/dateFileAppender-test.js
/* eslint max-classes-per-file: ["error", 3] */ const { test } = require('tap'); const path = require('path'); const fs = require('fs'); const EOL = require('os').EOL || '\n'; const format = require('date-format'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; function removeFile(filename) { try { fs.unlinkSync(path.join(__dirname, filename)); } catch (e) { // doesn't matter } } test('../../lib/appenders/dateFile', (batch) => { batch.test('with default settings', (t) => { const testFile = path.join(__dirname, 'date-appender-default.log'); log4js.configure({ appenders: { date: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['date'], level: 'DEBUG' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('This should be in the file.'); t.teardown(() => { removeFile('date-appender-default.log'); }); setTimeout(() => { fs.readFile(testFile, 'utf8', (err, contents) => { t.match(contents, 'This should be in the file'); t.match( contents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }, osDelay); }); batch.test('configure with dateFileAppender', (t) => { log4js.configure({ appenders: { date: { type: 'dateFile', filename: 'test/tap/date-file-test.log', pattern: '-yyyy-MM-dd', layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['date'], level: 'WARN' } }, }); const logger = log4js.getLogger('tests'); logger.info('this should not be written to the file'); logger.warn('this should be written to the file'); log4js.shutdown(() => { fs.readFile( path.join(__dirname, 'date-file-test.log'), 'utf8', (err, contents) => { t.match(contents, `this should be written to the file${EOL}`); t.equal( contents.indexOf('this should not be written to the file'), -1 ); t.end(); } ); }); t.teardown(() => { removeFile('date-file-test.log'); }); }); batch.test('configure with options.alwaysIncludePattern', (t) => { const options = { appenders: { date: { category: 'tests', type: 'dateFile', filename: 'test/tap/date-file-test', pattern: 'yyyy-MM-dd.log', alwaysIncludePattern: true, layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['date'], level: 'debug' } }, }; const thisTime = format.asString( options.appenders.date.pattern, new Date() ); const testFile = `date-file-test.${thisTime}`; const existingFile = path.join(__dirname, testFile); fs.writeFileSync(existingFile, `this is existing data${EOL}`, 'utf8'); log4js.configure(options); const logger = log4js.getLogger('tests'); logger.warn('this should be written to the file with the appended date'); t.teardown(() => { removeFile(testFile); }); // wait for filesystem to catch up log4js.shutdown(() => { fs.readFile(existingFile, 'utf8', (err, contents) => { t.match( contents, 'this is existing data', 'should not overwrite the file on open (issue #132)' ); t.match( contents, 'this should be written to the file with the appended date' ); t.end(); }); }); }); batch.test('should flush logs on shutdown', (t) => { const testFile = path.join(__dirname, 'date-appender-flush.log'); log4js.configure({ appenders: { test: { type: 'dateFile', filename: testFile } }, categories: { default: { appenders: ['test'], level: 'trace' } }, }); const logger = log4js.getLogger('default-settings'); logger.info('1'); logger.info('2'); logger.info('3'); t.teardown(() => { removeFile('date-appender-flush.log'); }); log4js.shutdown(() 
=> { fs.readFile(testFile, 'utf8', (err, fileContents) => { // 3 lines of output, plus the trailing newline. t.equal(fileContents.split(EOL).length, 4); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); }); }); batch.test('should map maxLogSize to maxSize', (t) => { const fakeStreamroller = {}; class DateRollingFileStream { constructor(filename, pattern, options) { fakeStreamroller.filename = filename; fakeStreamroller.pattern = pattern; fakeStreamroller.options = options; } on() {} // eslint-disable-line class-methods-use-this } fakeStreamroller.DateRollingFileStream = DateRollingFileStream; const dateFileAppenderModule = sandbox.require( '../../lib/appenders/dateFile', { requires: { streamroller: fakeStreamroller }, } ); dateFileAppenderModule.configure( { filename: 'cheese.log', pattern: 'yyyy', maxLogSize: 100, }, { basicLayout: () => {} } ); t.equal(fakeStreamroller.options.maxSize, 100); t.end(); }); batch.test('handling of writer.writable', (t) => { const output = []; let writable = true; const DateRollingFileStream = class { write(loggingEvent) { output.push(loggingEvent); this.written = true; return true; } // eslint-disable-next-line class-methods-use-this on() {} // eslint-disable-next-line class-methods-use-this get writable() { return writable; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { requires: { streamroller: { DateRollingFileStream, }, }, }); const appender = dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout(loggingEvent) { return loggingEvent.data; }, } ); t.test('should log when writer.writable=true', (assert) => { writable = true; appender({ data: 'something to log' }); assert.ok(output.length, 1); assert.match(output[output.length - 1], 'something to log'); assert.end(); }); t.test('should not log when writer.writable=false', (assert) => { writable = false; appender({ data: 'this should not be logged' }); assert.ok(output.length, 1); assert.notMatch(output[output.length - 1], 'this should not be logged'); assert.end(); }); t.end(); }); batch.test('when underlying stream errors', (t) => { let consoleArgs; let errorHandler; const DateRollingFileStream = class { end() { this.ended = true; } on(evt, cb) { if (evt === 'error') { this.errored = true; errorHandler = cb; } } write() { this.written = true; return true; } }; const dateFileAppender = sandbox.require('../../lib/appenders/dateFile', { globals: { console: { error(...args) { consoleArgs = args; }, }, }, requires: { streamroller: { DateRollingFileStream, }, }, }); dateFileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout() {} } ); errorHandler({ error: 'aargh' }); t.test('should log the error to console.error', (assert) => { assert.ok(consoleArgs); assert.equal( consoleArgs[0], 'log4js.dateFileAppender - Writing to file %s, error happened ' ); assert.equal(consoleArgs[1], 'test1.log'); assert.equal(consoleArgs[2].error, 'aargh'); assert.end(); }); t.end(); }); batch.end(); });
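The dateFile appender test above drives the appender through its main options. For orientation, here is a minimal configuration sketch using those same options; the filename and pattern values are illustrative.

```javascript
const log4js = require("log4js");

log4js.configure({
  appenders: {
    date: {
      type: "dateFile",
      filename: "logs/app.log",
      pattern: "yyyy-MM-dd", // roll to a new file when the date changes
      alwaysIncludePattern: true, // write straight to e.g. app.log.2022-10-01
      layout: { type: "messagePassThrough" },
    },
  },
  categories: { default: { appenders: ["date"], level: "info" } },
});

log4js.getLogger().info("this goes into today's file");

// shutdown() flushes any buffered writes before the process exits.
log4js.shutdown();
```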
./CHANGELOG.md
# log4js-node Changelog ## 6.6.1 - [fix: connectlogger nolog function](https://github.com/log4js-node/log4js-node/pull/1285) - thanks [@eyoboue](https://github.com/eyoboue) - [type: corrected AppenderModule interface and Recording interface](https://github.com/log4js-node/log4js-node/pull/1304) - thanks [@lamweili](https://github.com/lamweili) - test: extended timeout interval for OS operations - thanks [@lamweili](https://github.com/lamweili) - test: [#1306](https://github.com/log4js-node/log4js-node/pull/1306) - test: [#1297](https://github.com/log4js-node/log4js-node/pull/1297) - [test: support older Node.js versions](https://github.com/log4js-node/log4js-node/pull/1295) - thanks [@lamweili](https://github.com/lamweili) - [ci: added tests for Node.js 8.x](https://github.com/log4js-node/log4js-node/pull/1303) - thanks [@lamweili](https://github.com/lamweili) - [ci: added tests for Node.js 10.x, 18.x](https://github.com/log4js-node/log4js-node/pull/1301) - thanks [@lamweili](https://github.com/lamweili) - [ci: updated codeql from v1 to v2](https://github.com/log4js-node/log4js-node/pull/1302) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump lodash from 4.17.19 to 4.17.21](https://github.com/log4js-node/log4js-node/pull/1309) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): bump path-parse from 1.0.6 to 1.0.7](https://github.com/log4js-node/log4js-node/pull/1308) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): downgraded nyc from 15.1.0 to 14.1.1](https://github.com/log4js-node/log4js-node/pull/1305) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1296) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.11 to 4.0.13 - chore(deps): bump flatted from 3.2.5 to 3.2.6 - chore(deps): bump streamroller from 3.1.1 to 3.1.2 - chore(deps-dev): bump @commitlint/cli from 17.0.2 to 17.0.3 - chore(deps-dev): bump @commitlint/config-conventional from 17.0.2 to 17.0.3 - [chore(deps-dev): bump eslint from 8.16.0 to 8.20.0](https://github.com/log4js-node/log4js-node/pull/1300) - chore(deps-dev): bump eslint-plugin-prettier from 4.0.0 to 4.2.1 - chore(deps-dev): bump prettier from 2.6.0 to 2.7.1 - chore(deps-dev): bump tap from 16.2.0 to 16.3.0 - chore(deps-dev): bump typescript from 4.7.2 to 4.7.4 - chore(deps): updated package-lock.json ## 6.6.0 - [feat: adding function(req, res) support to connectLogger nolog](https://github.com/log4js-node/log4js-node/pull/1279) - thanks [@eyoboue](https://github.com/eyoboue) - [fix: ability to load CJS appenders (through .cjs extension) for ESM packages](https://github.com/log4js-node/log4js-node/pull/1280) - thanks [@lamweili](https://github.com/lamweili) - [type: consistent typing for Logger](https://github.com/log4js-node/log4js-node/pull/1276) - thanks [@taozi0818](https://github.com/taozi0818) - [type: Make Appender Type extensible from other modules and the user](https://github.com/log4js-node/log4js-node/pull/1267) - thanks [@ZachHaber](https://github.com/ZachHaber) - [refactor: clearer logic for invalid level and LOG synonym](https://github.com/log4js-node/log4js-node/pull/1264) - thanks [@lamweili](https://github.com/lamweili) - [style: ran prettier and requires prettier for ci](https://github.com/log4js-node/log4js-node/pull/1271) - thanks [@ZachHaber](https://github.com/ZachHaber) - [docs: renamed peteriman to lamweili in 
changelog](https://github.com/log4js-node/log4js-node/pull/1272) - thanks [@lamweili](https://github.com/lamweili) - [ci: replaced validate-commit-msg, fixed husky config, removed codecov](https://github.com/log4js-node/log4js-node/pull/1274) - thanks [@ZachHaber](https://github.com/ZachHaber) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1266) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump typescript from 4.6.4 to 4.7.2 - chore(deps): bump date-format from 4.0.10 to 4.0.11 - chore(deps): updated package-lock.json ## 6.5.2 - [type: add LogEvent.serialise](https://github.com/log4js-node/log4js-node/pull/1260) - thanks [@marrowleaves](https://github.com/marrowleaves) ## 6.5.1 - [fix: fs.appendFileSync should use flag instead of flags](https://github.com/log4js-node/log4js-node/pull/1257) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1258) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump streamroller from 3.1.0 to 3.1.1 - chore(deps): updated package-lock.json ## 6.5.0 - [feat: logger.log() to be synonym of logger.info()](https://github.com/log4js-node/log4js-node/pull/1254) - thanks [@lamweili](https://github.com/lamweili) - [feat: tilde expansion for filename](https://github.com/log4js-node/log4js-node/pull/1252) - thanks [@lamweili](https://github.com/lamweili) - [fix: better file validation](https://github.com/log4js-node/log4js-node/pull/1251) - thanks [@lamweili](https://github.com/lamweili) - [fix: fallback for logger.log outputs nothing](https://github.com/log4js-node/log4js-node/pull/1247) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated fileAppender maxLogSize documentation](https://github.com/log4js-node/log4js-node/pull/1248) - thanks [@lamweili](https://github.com/lamweili) - [ci: enforced 100% test coverage tests](https://github.com/log4js-node/log4js-node/pull/1253) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1256) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.15.0 to 8.16.0 - chore(deps): bump streamroller from 3.0.9 to 3.1.0 - chore(deps): updated package-lock.json ## 6.4.7 - [fix: dateFileAppender unable to use units in maxLogSize](https://github.com/log4js-node/log4js-node/pull/1243) - thanks [@lamweili](https://github.com/lamweili) - [type: added fileNameSep for FileAppender and DateFileAppender](https://github.com/log4js-node/log4js-node/pull/1241) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated usage of units for maxLogSize](https://github.com/log4js-node/log4js-node/pull/1242) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated comments in typescript def](https://github.com/log4js-node/log4js-node/pull/1240) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1244) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.14.0 to 8.15.0 - chore(deps-dev): bump husky from 7.0.4 to 8.0.1 - chore(deps-dev): bump tap from 16.1.0 to 16.2.0 - chore(deps-dev): bump typescript from 4.6.3 to 4.6.4 - chore(deps): bump date-format from 4.0.9 to 4.0.10 - chore(deps): bump streamroller from 3.0.8 to 3.0.9 - chore(deps): updated package-lock.json - [chore(deps): updated 
dependencies](https://github.com/log4js-node/log4js-node/pull/1238) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump tap from 16.0.1 to 16.1.0 - chore(deps-dev): updated package-lock.json ## 6.4.6 - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1236) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.13.0 to 8.14.0 - chore(deps): bump date-format from 4.0.7 to 4.0.9 - chore(deps): bump streamroller from 3.0.7 to 3.0.8 - fix: [#1216](https://github.com/log4js-node/log4js-node/issues/1216) where promise rejection is not handled ([streamroller@3.0.8 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps): updated package-lock.json - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1234) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump fs-extra from 10.0.1 to 10.1.0 - chore(deps): updated package-lock.json ## 6.4.5 - [fix: deserialise for enableCallStack features: filename, lineNumber, columnNumber, callStack](https://github.com/log4js-node/log4js-node/pull/1230) - thanks [@lamweili](https://github.com/lamweili) - [fix: fileDepth for ESM](https://github.com/log4js-node/log4js-node/pull/1224) - thanks [@lamweili](https://github.com/lamweili) - [refactor: replace deprecated String.prototype.substr()](https://github.com/log4js-node/log4js-node/pull/1223) - thanks [@CommanderRoot](https://github.com/CommanderRoot) - [type: LogEvent types](https://github.com/log4js-node/log4js-node/pull/1231) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated typescript usage](https://github.com/log4js-node/log4js-node/pull/1229) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1232) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.6 to 4.0.7 - chore(deps): bump streamroller from 3.0.6 to 3.0.7 - fix: [#1225](https://github.com/log4js-node/log4js-node/issues/1225) where fs-extra throws error when fs.realpath.native is undefined ([streamroller@3.0.7 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps): updated package-lock.json - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1228) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.11.0 to 8.13.0 - chore(deps-dev): bump eslint-plugin-import from 2.25.4 to 2.26.0 - chore(deps-dev): bump tap from 16.0.0 to 16.0.1 - chore(deps-dev): bump typescript from 4.6.2 to 4.6.3 - chore(deps-dev): updated package-lock.json - [chore(deps-dev): bump minimist from 1.2.5 to 1.2.6](https://github.com/log4js-node/log4js-node/pull/1227) - thanks [@Dependabot](https://github.com/dependabot) ## 6.4.4 - [fix: set logger.level on runtime will no longer wrongly reset useCallStack](https://github.com/log4js-node/log4js-node/pull/1217) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated docs for broken links and inaccessible pages](https://github.com/log4js-node/log4js-node/pull/1219) - thanks [@lamweili](https://github.com/lamweili) - [docs: broken link to gelf appender](https://github.com/log4js-node/log4js-node/pull/1218) - thanks [@mattalexx](https://github.com/mattalexx) - [docs: updated docs for appenders module loading](https://github.com/log4js-node/log4js-node/pull/985) - thanks 
[@leonimurilo](https://github.com/leonimurilo) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1221) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump streamroller from 3.0.5 to 3.0.6 - chore(deps): bump debug from 4.3.3 to 4.3.4 - chore(deps): bump date-format from 4.0.5 to 4.0.6 - chore(deps-dev): bump prettier from 2.5.1 to 2.6.0 - chore(deps): updated package-lock.json ## 6.4.3 - [fix: added filename validation](https://github.com/log4js-node/log4js-node/pull/1201) - thanks [@lamweili](https://github.com/lamweili) - [refactor: do not initialise default appenders as it will be done again by configure()](https://github.com/log4js-node/log4js-node/pull/1210) - thanks [@lamweili](https://github.com/lamweili) - [refactor: defensive coding for cluster=null if require('cluster') fails in try-catch ](https://github.com/log4js-node/log4js-node/pull/1199) - thanks [@lamweili](https://github.com/lamweili) - [refactor: removed redundant logic in tcp-serverAppender](https://github.com/log4js-node/log4js-node/pull/1198) - thanks [@lamweili](https://github.com/lamweili) - [refactor: removed redundant logic in multiprocessAppender](https://github.com/log4js-node/log4js-node/pull/1197) - thanks [@lamweili](https://github.com/lamweili) - test: 100% test coverage - thanks [@lamweili](https://github.com/lamweili) - test: part 1 of 3: [#1200](https://github.com/log4js-node/log4js-node/pull/1200) - test: part 2 of 3: [#1204](https://github.com/log4js-node/log4js-node/pull/1204) - test: part 3 of 3: [#1205](https://github.com/log4js-node/log4js-node/pull/1205) - [test: improved test cases](https://github.com/log4js-node/log4js-node/pull/1211) - [docs: updated README.md with badges](https://github.com/log4js-node/log4js-node/pull/1209) - thanks [@lamweili](https://github.com/lamweili) - [docs: added docs for istanbul ignore](https://github.com/log4js-node/log4js-node/pull/1208) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated logger api docs](https://github.com/log4js-node/log4js-node/pull/1203) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated file and fileSync appender docs](https://github.com/log4js-node/log4js-node/pull/1202) - thanks [@lamweili](https://github.com/lamweili) - [chore(lint): improve eslint rules](https://github.com/log4js-node/log4js-node/pull/1206) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1207) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.10.0 to 8.11.0 - chore(deps-dev): bump eslint-config-airbnb-base from 13.2.0 to 15.0.0 - chore(deps-dev): bump eslint-config-prettier from 8.4.0 to 8.5.0 - chore(deps-dev): bump tap from 15.1.6 to 16.0.0 - chore(deps): bump date-format from 4.0.4 to 4.0.5 - chore(deps): bump streamroller from 3.0.4 to 3.0.5 - chore(deps): updated package-lock.json ## 6.4.2 - [fix: fileSync appender to create directory recursively](https://github.com/log4js-node/log4js-node/pull/1191) - thanks [@lamweili](https://github.com/lamweili) - [fix: serialise() for NaN, Infinity, -Infinity and undefined](https://github.com/log4js-node/log4js-node/pull/1188) - thanks [@lamweili](https://github.com/lamweili) - [fix: connectLogger not logging on close](https://github.com/log4js-node/log4js-node/pull/1179) - thanks [@lamweili](https://github.com/lamweili) - [refactor: defensive coding](https://github.com/log4js-node/log4js-node/pull/1183) - thanks 
[@lamweili](https://github.com/lamweili) - [type: fixed Logger constructor](https://github.com/log4js-node/log4js-node/pull/1177) - thanks [@lamweili](https://github.com/lamweili) - [test: improve test coverage](https://github.com/log4js-node/log4js-node/pull/1184) - thanks [@lamweili](https://github.com/lamweili) - [test: refactor and replaced tap deprecation in preparation for tap v15](https://github.com/log4js-node/log4js-node/pull/1172) - thanks [@lamweili](https://github.com/lamweili) - [test: added e2e test for multiprocess Appender](https://github.com/log4js-node/log4js-node/pull/1170) - thanks [@nicojs](https://github.com/nicojs) - [docs: updated file appender docs](https://github.com/log4js-node/log4js-node/pull/1182) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated dateFile appender docs](https://github.com/log4js-node/log4js-node/pull/1181) - thanks [@lamweili](https://github.com/lamweili) - [docs: corrected typo in sample code for multiFile appender](https://github.com/log4js-node/log4js-node/pull/1180) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated deps-dev](https://github.com/log4js-node/log4js-node/pull/1194) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.3 to 4.0.4 - chore(deps): bump streamroller from 3.0.2 to 3.0.4 - fix: [#1189](https://github.com/log4js-node/log4js-node/issues/1189) for an compatibility issue with directory creation for NodeJS < 10.12.0 ([streamroller@3.0.3 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps-dev): bump eslint from 8.8.0 to 8.10.0 - chore(deps-dev): bump eslint-config-prettier from 8.3.0 to 8.4.0 - chore(deps-dev): bump fs-extra from 10.0.0 to 10.0.1 - chore(deps-dev): bump typescript from 4.5.5 to 4.6.2 - [chore(deps): updated deps-dev](https://github.com/log4js-node/log4js-node/pull/1185) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump flatted from 3.2.4 to 3.2.5 - chore(deps-dev): bump eslint from 8.7.0 to 8.8.0 - [chore(deps): updated package-lock.json](https://github.com/log4js-node/log4js-node/pull/1174) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump tap from 14.10.7 to 15.1.6](https://github.com/log4js-node/log4js-node/pull/1173) - thanks [@lamweili](https://github.com/lamweili) ## 6.4.1 - [fix: startup multiprocess even when no direct appenders](https://github.com/log4js-node/log4js-node/pull/1162) - thanks [@nicojs](https://github.com/nicojs) - [refactor: fixed eslint warnings](https://github.com/log4js-node/log4js-node/pull/1165) - thanks [@lamweili](https://github.com/lamweili) - [refactor: additional alias for date patterns](https://github.com/log4js-node/log4js-node/pull/1163) - thanks [@lamweili](https://github.com/lamweili) - [refactor: added emitWarning for deprecation](https://github.com/log4js-node/log4js-node/pull/1164) - thanks [@lamweili](https://github.com/lamweili) - [type: Fixed wrong types from 6.4.0 regression](https://github.com/log4js-node/log4js-node/pull/1158) - thanks [@glasser](https://github.com/glasser) - [docs: changed author to contributors in package.json](https://github.com/log4js-node/log4js-node/pull/1153) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump node-fetch from 2.6.6 to 2.6.7](https://github.com/log4js-node/log4js-node/pull/1167) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): bump typescript from 4.5.4 to 
4.5.5](https://github.com/log4js-node/log4js-node/pull/1166) - thanks [@lamweili](https://github.com/lamweili) ## 6.4.0 - BREAKING CHANGE 💥 New default file permissions may cause external applications unable to read logs. A [manual code/configuration change](https://github.com/log4js-node/log4js-node/pull/1141#issuecomment-1076224470) is required. - [feat: added warnings when log() is used with invalid levels before fallbacking to INFO](https://github.com/log4js-node/log4js-node/pull/1062) - thanks [@abernh](https://github.com/abernh) - [feat: exposed Recording](https://github.com/log4js-node/log4js-node/pull/1103) - thanks [@polo-language](https://github.com/polo-language) - [fix: default file permission to be 0o600 instead of 0o644](https://github.com/log4js-node/log4js-node/pull/1141) - thanks [ranjit-git](https://www.huntr.dev/users/ranjit-git) and [@lamweili](https://github.com/lamweili) - [docs: updated fileSync.md and misc comments](https://github.com/log4js-node/log4js-node/pull/1148) - thanks [@lamweili](https://github.com/lamweili) - [fix: file descriptor leak if repeated configure()](https://github.com/log4js-node/log4js-node/pull/1113) - thanks [@lamweili](https://github.com/lamweili) - [fix: MaxListenersExceededWarning from NodeJS](https://github.com/log4js-node/log4js-node/pull/1110) - thanks [@lamweili](https://github.com/lamweili) - [test: added assertion for increase of SIGHUP listeners on log4js.configure()](https://github.com/log4js-node/log4js-node/pull/1142) - thanks [@lamweili](https://github.com/lamweili) - [fix: missing TCP appender with Webpack and Typescript](https://github.com/log4js-node/log4js-node/pull/1028) - thanks [@techmunk](https://github.com/techmunk) - [fix: dateFile appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/1097) - thanks [@4eb0da](https://github.com/4eb0da) - [refactor: using writer.writable instead of alive for checking](https://github.com/log4js-node/log4js-node/pull/1144) - thanks [@lamweili](https://github.com/lamweili) - [fix: TCP appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/1089) - thanks [@jhonatanTeixeira](https://github.com/jhonatanTeixeira) - [fix: multiprocess appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/529) - thanks [@harlentan](https://github.com/harlentan) - [test: update fakeFS.read as graceful-fs uses it](https://github.com/log4js-node/log4js-node/pull/1127) - thanks [@lamweili](https://github.com/lamweili) - [test: update fakeFS.realpath as fs-extra uses it](https://github.com/log4js-node/log4js-node/pull/1128) - thanks [@lamweili](https://github.com/lamweili) - test: added tap.tearDown() to clean up test files - test: [#1143](https://github.com/log4js-node/log4js-node/pull/1143) - thanks [@lamweili](https://github.com/lamweili) - test: [#1022](https://github.com/log4js-node/log4js-node/pull/1022) - thanks [@abetomo](https://github.com/abetomo) - [type: improved @types for AppenderModule](https://github.com/log4js-node/log4js-node/pull/1079) - thanks [@nicobao](https://github.com/nicobao) - [type: Updated fileSync appender types](https://github.com/log4js-node/log4js-node/pull/1116) - thanks [@lamweili](https://github.com/lamweili) - [type: Removed erroneous type in file appender](https://github.com/log4js-node/log4js-node/pull/1031) - thanks [@vdmtrv](https://github.com/vdmtrv) - [type: Updated Logger.log type](https://github.com/log4js-node/log4js-node/pull/1115) - thanks [@ZLundqvist](https://github.com/ZLundqvist) - 
# log4js-node Changelog ## 6.6.1 - [fix: connectlogger nolog function](https://github.com/log4js-node/log4js-node/pull/1285) - thanks [@eyoboue](https://github.com/eyoboue) - [type: corrected AppenderModule interface and Recording interface](https://github.com/log4js-node/log4js-node/pull/1304) - thanks [@lamweili](https://github.com/lamweili) - test: extended timeout interval for OS operations - thanks [@lamweili](https://github.com/lamweili) - test: [#1306](https://github.com/log4js-node/log4js-node/pull/1306) - test: [#1297](https://github.com/log4js-node/log4js-node/pull/1297) - [test: support older Node.js versions](https://github.com/log4js-node/log4js-node/pull/1295) - thanks [@lamweili](https://github.com/lamweili) - [ci: added tests for Node.js 8.x](https://github.com/log4js-node/log4js-node/pull/1303) - thanks [@lamweili](https://github.com/lamweili) - [ci: added tests for Node.js 10.x, 18.x](https://github.com/log4js-node/log4js-node/pull/1301) - thanks [@lamweili](https://github.com/lamweili) - [ci: updated codeql from v1 to v2](https://github.com/log4js-node/log4js-node/pull/1302) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump lodash from 4.17.19 to 4.17.21](https://github.com/log4js-node/log4js-node/pull/1309) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): bump path-parse from 1.0.6 to 1.0.7](https://github.com/log4js-node/log4js-node/pull/1308) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): downgraded nyc from 15.1.0 to 14.1.1](https://github.com/log4js-node/log4js-node/pull/1305) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1296) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.11 to 4.0.13 - chore(deps): bump flatted from 3.2.5 to 3.2.6 - chore(deps): bump streamroller from 3.1.1 to 3.1.2 - chore(deps-dev): bump @commitlint/cli from 17.0.2 to 17.0.3 - chore(deps-dev): bump @commitlint/config-conventional from 17.0.2 to 17.0.3 - [chore(deps-dev): bump eslint from 8.16.0 to 8.20.0](https://github.com/log4js-node/log4js-node/pull/1300) - chore(deps-dev): bump eslint-plugin-prettier from 4.0.0 to 4.2.1 - chore(deps-dev): bump prettier from 2.6.0 to 2.7.1 - chore(deps-dev): bump tap from 16.2.0 to 16.3.0 - chore(deps-dev): bump typescript from 4.7.2 to 4.7.4 - chore(deps): updated package-lock.json ## 6.6.0 - [feat: adding function(req, res) support to connectLogger nolog](https://github.com/log4js-node/log4js-node/pull/1279) - thanks [@eyoboue](https://github.com/eyoboue) - [fix: ability to load CJS appenders (through .cjs extension) for ESM packages](https://github.com/log4js-node/log4js-node/pull/1280) - thanks [@lamweili](https://github.com/lamweili) - [type: consistent typing for Logger](https://github.com/log4js-node/log4js-node/pull/1276) - thanks [@taozi0818](https://github.com/taozi0818) - [type: Make Appender Type extensible from other modules and the user](https://github.com/log4js-node/log4js-node/pull/1267) - thanks [@ZachHaber](https://github.com/ZachHaber) - [refactor: clearer logic for invalid level and LOG synonym](https://github.com/log4js-node/log4js-node/pull/1264) - thanks [@lamweili](https://github.com/lamweili) - [style: ran prettier and requires prettier for ci](https://github.com/log4js-node/log4js-node/pull/1271) - thanks [@ZachHaber](https://github.com/ZachHaber) - [docs: renamed peteriman to lamweili in 
changelog](https://github.com/log4js-node/log4js-node/pull/1272) - thanks [@lamweili](https://github.com/lamweili) - [ci: replaced validate-commit-msg, fixed husky config, removed codecov](https://github.com/log4js-node/log4js-node/pull/1274) - thanks [@ZachHaber](https://github.com/ZachHaber) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1266) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump typescript from 4.6.4 to 4.7.2 - chore(deps): bump date-format from 4.0.10 to 4.0.11 - chore(deps): updated package-lock.json ## 6.5.2 - [type: add LogEvent.serialise](https://github.com/log4js-node/log4js-node/pull/1260) - thanks [@marrowleaves](https://github.com/marrowleaves) ## 6.5.1 - [fix: fs.appendFileSync should use flag instead of flags](https://github.com/log4js-node/log4js-node/pull/1257) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1258) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump streamroller from 3.1.0 to 3.1.1 - chore(deps): updated package-lock.json ## 6.5.0 - [feat: logger.log() to be synonym of logger.info()](https://github.com/log4js-node/log4js-node/pull/1254) - thanks [@lamweili](https://github.com/lamweili) - [feat: tilde expansion for filename](https://github.com/log4js-node/log4js-node/pull/1252) - thanks [@lamweili](https://github.com/lamweili) - [fix: better file validation](https://github.com/log4js-node/log4js-node/pull/1251) - thanks [@lamweili](https://github.com/lamweili) - [fix: fallback for logger.log outputs nothing](https://github.com/log4js-node/log4js-node/pull/1247) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated fileAppender maxLogSize documentation](https://github.com/log4js-node/log4js-node/pull/1248) - thanks [@lamweili](https://github.com/lamweili) - [ci: enforced 100% test coverage tests](https://github.com/log4js-node/log4js-node/pull/1253) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1256) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.15.0 to 8.16.0 - chore(deps): bump streamroller from 3.0.9 to 3.1.0 - chore(deps): updated package-lock.json ## 6.4.7 - [fix: dateFileAppender unable to use units in maxLogSize](https://github.com/log4js-node/log4js-node/pull/1243) - thanks [@lamweili](https://github.com/lamweili) - [type: added fileNameSep for FileAppender and DateFileAppender](https://github.com/log4js-node/log4js-node/pull/1241) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated usage of units for maxLogSize](https://github.com/log4js-node/log4js-node/pull/1242) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated comments in typescript def](https://github.com/log4js-node/log4js-node/pull/1240) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1244) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.14.0 to 8.15.0 - chore(deps-dev): bump husky from 7.0.4 to 8.0.1 - chore(deps-dev): bump tap from 16.1.0 to 16.2.0 - chore(deps-dev): bump typescript from 4.6.3 to 4.6.4 - chore(deps): bump date-format from 4.0.9 to 4.0.10 - chore(deps): bump streamroller from 3.0.8 to 3.0.9 - chore(deps): updated package-lock.json - [chore(deps): updated 
dependencies](https://github.com/log4js-node/log4js-node/pull/1238) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump tap from 16.0.1 to 16.1.0 - chore(deps-dev): updated package-lock.json ## 6.4.6 - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1236) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.13.0 to 8.14.0 - chore(deps): bump date-format from 4.0.7 to 4.0.9 - chore(deps): bump streamroller from 3.0.7 to 3.0.8 - fix: [#1216](https://github.com/log4js-node/log4js-node/issues/1216) where promise rejection is not handled ([streamroller@3.0.8 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps): updated package-lock.json - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1234) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump fs-extra from 10.0.1 to 10.1.0 - chore(deps): updated package-lock.json ## 6.4.5 - [fix: deserialise for enableCallStack features: filename, lineNumber, columnNumber, callStack](https://github.com/log4js-node/log4js-node/pull/1230) - thanks [@lamweili](https://github.com/lamweili) - [fix: fileDepth for ESM](https://github.com/log4js-node/log4js-node/pull/1224) - thanks [@lamweili](https://github.com/lamweili) - [refactor: replace deprecated String.prototype.substr()](https://github.com/log4js-node/log4js-node/pull/1223) - thanks [@CommanderRoot](https://github.com/CommanderRoot) - [type: LogEvent types](https://github.com/log4js-node/log4js-node/pull/1231) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated typescript usage](https://github.com/log4js-node/log4js-node/pull/1229) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1232) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.6 to 4.0.7 - chore(deps): bump streamroller from 3.0.6 to 3.0.7 - fix: [#1225](https://github.com/log4js-node/log4js-node/issues/1225) where fs-extra throws error when fs.realpath.native is undefined ([streamroller@3.0.7 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps): updated package-lock.json - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1228) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.11.0 to 8.13.0 - chore(deps-dev): bump eslint-plugin-import from 2.25.4 to 2.26.0 - chore(deps-dev): bump tap from 16.0.0 to 16.0.1 - chore(deps-dev): bump typescript from 4.6.2 to 4.6.3 - chore(deps-dev): updated package-lock.json - [chore(deps-dev): bump minimist from 1.2.5 to 1.2.6](https://github.com/log4js-node/log4js-node/pull/1227) - thanks [@Dependabot](https://github.com/dependabot) ## 6.4.4 - [fix: set logger.level on runtime will no longer wrongly reset useCallStack](https://github.com/log4js-node/log4js-node/pull/1217) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated docs for broken links and inaccessible pages](https://github.com/log4js-node/log4js-node/pull/1219) - thanks [@lamweili](https://github.com/lamweili) - [docs: broken link to gelf appender](https://github.com/log4js-node/log4js-node/pull/1218) - thanks [@mattalexx](https://github.com/mattalexx) - [docs: updated docs for appenders module loading](https://github.com/log4js-node/log4js-node/pull/985) - thanks 
[@leonimurilo](https://github.com/leonimurilo) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1221) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump streamroller from 3.0.5 to 3.0.6 - chore(deps): bump debug from 4.3.3 to 4.3.4 - chore(deps): bump date-format from 4.0.5 to 4.0.6 - chore(deps-dev): bump prettier from 2.5.1 to 2.6.0 - chore(deps): updated package-lock.json ## 6.4.3 - [fix: added filename validation](https://github.com/log4js-node/log4js-node/pull/1201) - thanks [@lamweili](https://github.com/lamweili) - [refactor: do not initialise default appenders as it will be done again by configure()](https://github.com/log4js-node/log4js-node/pull/1210) - thanks [@lamweili](https://github.com/lamweili) - [refactor: defensive coding for cluster=null if require('cluster') fails in try-catch ](https://github.com/log4js-node/log4js-node/pull/1199) - thanks [@lamweili](https://github.com/lamweili) - [refactor: removed redundant logic in tcp-serverAppender](https://github.com/log4js-node/log4js-node/pull/1198) - thanks [@lamweili](https://github.com/lamweili) - [refactor: removed redundant logic in multiprocessAppender](https://github.com/log4js-node/log4js-node/pull/1197) - thanks [@lamweili](https://github.com/lamweili) - test: 100% test coverage - thanks [@lamweili](https://github.com/lamweili) - test: part 1 of 3: [#1200](https://github.com/log4js-node/log4js-node/pull/1200) - test: part 2 of 3: [#1204](https://github.com/log4js-node/log4js-node/pull/1204) - test: part 3 of 3: [#1205](https://github.com/log4js-node/log4js-node/pull/1205) - [test: improved test cases](https://github.com/log4js-node/log4js-node/pull/1211) - [docs: updated README.md with badges](https://github.com/log4js-node/log4js-node/pull/1209) - thanks [@lamweili](https://github.com/lamweili) - [docs: added docs for istanbul ignore](https://github.com/log4js-node/log4js-node/pull/1208) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated logger api docs](https://github.com/log4js-node/log4js-node/pull/1203) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated file and fileSync appender docs](https://github.com/log4js-node/log4js-node/pull/1202) - thanks [@lamweili](https://github.com/lamweili) - [chore(lint): improve eslint rules](https://github.com/log4js-node/log4js-node/pull/1206) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1207) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint from 8.10.0 to 8.11.0 - chore(deps-dev): bump eslint-config-airbnb-base from 13.2.0 to 15.0.0 - chore(deps-dev): bump eslint-config-prettier from 8.4.0 to 8.5.0 - chore(deps-dev): bump tap from 15.1.6 to 16.0.0 - chore(deps): bump date-format from 4.0.4 to 4.0.5 - chore(deps): bump streamroller from 3.0.4 to 3.0.5 - chore(deps): updated package-lock.json ## 6.4.2 - [fix: fileSync appender to create directory recursively](https://github.com/log4js-node/log4js-node/pull/1191) - thanks [@lamweili](https://github.com/lamweili) - [fix: serialise() for NaN, Infinity, -Infinity and undefined](https://github.com/log4js-node/log4js-node/pull/1188) - thanks [@lamweili](https://github.com/lamweili) - [fix: connectLogger not logging on close](https://github.com/log4js-node/log4js-node/pull/1179) - thanks [@lamweili](https://github.com/lamweili) - [refactor: defensive coding](https://github.com/log4js-node/log4js-node/pull/1183) - thanks 
[@lamweili](https://github.com/lamweili) - [type: fixed Logger constructor](https://github.com/log4js-node/log4js-node/pull/1177) - thanks [@lamweili](https://github.com/lamweili) - [test: improve test coverage](https://github.com/log4js-node/log4js-node/pull/1184) - thanks [@lamweili](https://github.com/lamweili) - [test: refactor and replaced tap deprecation in preparation for tap v15](https://github.com/log4js-node/log4js-node/pull/1172) - thanks [@lamweili](https://github.com/lamweili) - [test: added e2e test for multiprocess Appender](https://github.com/log4js-node/log4js-node/pull/1170) - thanks [@nicojs](https://github.com/nicojs) - [docs: updated file appender docs](https://github.com/log4js-node/log4js-node/pull/1182) - thanks [@lamweili](https://github.com/lamweili) - [docs: updated dateFile appender docs](https://github.com/log4js-node/log4js-node/pull/1181) - thanks [@lamweili](https://github.com/lamweili) - [docs: corrected typo in sample code for multiFile appender](https://github.com/log4js-node/log4js-node/pull/1180) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated deps-dev](https://github.com/log4js-node/log4js-node/pull/1194) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump date-format from 4.0.3 to 4.0.4 - chore(deps): bump streamroller from 3.0.2 to 3.0.4 - fix: [#1189](https://github.com/log4js-node/log4js-node/issues/1189) for an compatibility issue with directory creation for NodeJS < 10.12.0 ([streamroller@3.0.3 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - chore(deps-dev): bump eslint from 8.8.0 to 8.10.0 - chore(deps-dev): bump eslint-config-prettier from 8.3.0 to 8.4.0 - chore(deps-dev): bump fs-extra from 10.0.0 to 10.0.1 - chore(deps-dev): bump typescript from 4.5.5 to 4.6.2 - [chore(deps): updated deps-dev](https://github.com/log4js-node/log4js-node/pull/1185) - thanks [@lamweili](https://github.com/lamweili) - chore(deps): bump flatted from 3.2.4 to 3.2.5 - chore(deps-dev): bump eslint from 8.7.0 to 8.8.0 - [chore(deps): updated package-lock.json](https://github.com/log4js-node/log4js-node/pull/1174) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump tap from 14.10.7 to 15.1.6](https://github.com/log4js-node/log4js-node/pull/1173) - thanks [@lamweili](https://github.com/lamweili) ## 6.4.1 - [fix: startup multiprocess even when no direct appenders](https://github.com/log4js-node/log4js-node/pull/1162) - thanks [@nicojs](https://github.com/nicojs) - [refactor: fixed eslint warnings](https://github.com/log4js-node/log4js-node/pull/1165) - thanks [@lamweili](https://github.com/lamweili) - [refactor: additional alias for date patterns](https://github.com/log4js-node/log4js-node/pull/1163) - thanks [@lamweili](https://github.com/lamweili) - [refactor: added emitWarning for deprecation](https://github.com/log4js-node/log4js-node/pull/1164) - thanks [@lamweili](https://github.com/lamweili) - [type: Fixed wrong types from 6.4.0 regression](https://github.com/log4js-node/log4js-node/pull/1158) - thanks [@glasser](https://github.com/glasser) - [docs: changed author to contributors in package.json](https://github.com/log4js-node/log4js-node/pull/1153) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump node-fetch from 2.6.6 to 2.6.7](https://github.com/log4js-node/log4js-node/pull/1167) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): bump typescript from 4.5.4 to 
4.5.5](https://github.com/log4js-node/log4js-node/pull/1166) - thanks [@lamweili](https://github.com/lamweili) ## 6.4.0 - BREAKING CHANGE 💥 New default file permissions may cause external applications unable to read logs. A [manual code/configuration change](https://github.com/log4js-node/log4js-node/pull/1141#issuecomment-1076224470) is required. - [feat: added warnings when log() is used with invalid levels before fallbacking to INFO](https://github.com/log4js-node/log4js-node/pull/1062) - thanks [@abernh](https://github.com/abernh) - [feat: exposed Recording](https://github.com/log4js-node/log4js-node/pull/1103) - thanks [@polo-language](https://github.com/polo-language) - [fix: default file permission to be 0o600 instead of 0o644](https://github.com/log4js-node/log4js-node/pull/1141) - thanks [ranjit-git](https://www.huntr.dev/users/ranjit-git) and [@lamweili](https://github.com/lamweili) - [docs: updated fileSync.md and misc comments](https://github.com/log4js-node/log4js-node/pull/1148) - thanks [@lamweili](https://github.com/lamweili) - [fix: file descriptor leak if repeated configure()](https://github.com/log4js-node/log4js-node/pull/1113) - thanks [@lamweili](https://github.com/lamweili) - [fix: MaxListenersExceededWarning from NodeJS](https://github.com/log4js-node/log4js-node/pull/1110) - thanks [@lamweili](https://github.com/lamweili) - [test: added assertion for increase of SIGHUP listeners on log4js.configure()](https://github.com/log4js-node/log4js-node/pull/1142) - thanks [@lamweili](https://github.com/lamweili) - [fix: missing TCP appender with Webpack and Typescript](https://github.com/log4js-node/log4js-node/pull/1028) - thanks [@techmunk](https://github.com/techmunk) - [fix: dateFile appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/1097) - thanks [@4eb0da](https://github.com/4eb0da) - [refactor: using writer.writable instead of alive for checking](https://github.com/log4js-node/log4js-node/pull/1144) - thanks [@lamweili](https://github.com/lamweili) - [fix: TCP appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/1089) - thanks [@jhonatanTeixeira](https://github.com/jhonatanTeixeira) - [fix: multiprocess appender exiting NodeJS on error](https://github.com/log4js-node/log4js-node/pull/529) - thanks [@harlentan](https://github.com/harlentan) - [test: update fakeFS.read as graceful-fs uses it](https://github.com/log4js-node/log4js-node/pull/1127) - thanks [@lamweili](https://github.com/lamweili) - [test: update fakeFS.realpath as fs-extra uses it](https://github.com/log4js-node/log4js-node/pull/1128) - thanks [@lamweili](https://github.com/lamweili) - test: added tap.tearDown() to clean up test files - test: [#1143](https://github.com/log4js-node/log4js-node/pull/1143) - thanks [@lamweili](https://github.com/lamweili) - test: [#1022](https://github.com/log4js-node/log4js-node/pull/1022) - thanks [@abetomo](https://github.com/abetomo) - [type: improved @types for AppenderModule](https://github.com/log4js-node/log4js-node/pull/1079) - thanks [@nicobao](https://github.com/nicobao) - [type: Updated fileSync appender types](https://github.com/log4js-node/log4js-node/pull/1116) - thanks [@lamweili](https://github.com/lamweili) - [type: Removed erroneous type in file appender](https://github.com/log4js-node/log4js-node/pull/1031) - thanks [@vdmtrv](https://github.com/vdmtrv) - [type: Updated Logger.log type](https://github.com/log4js-node/log4js-node/pull/1115) - thanks [@ZLundqvist](https://github.com/ZLundqvist) - 
[type: Updated Logger.\_log type](https://github.com/log4js-node/log4js-node/pull/1117) - thanks [@lamweili](https://github.com/lamweili) - [type: Updated Logger.level type](https://github.com/log4js-node/log4js-node/pull/1118) - thanks [@lamweili](https://github.com/lamweili) - [type: Updated Levels.getLevel type](https://github.com/log4js-node/log4js-node/pull/1072) - thanks [@saulzhong](https://github.com/saulzhong) - [chore(deps): bump streamroller from 3.0.1 to 3.0.2](https://github.com/log4js-node/log4js-node/pull/1147) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump date-format from 4.0.2 to 4.0.3](https://github.com/log4js-node/log4js-node/pull/1146) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump eslint from from 8.6.0 to 8.7.0](https://github.com/log4js-node/log4js-node/pull/1145) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump nyc from 14.1.1 to 15.1.0](https://github.com/log4js-node/log4js-node/pull/1140) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump eslint from 5.16.0 to 8.6.0](https://github.com/log4js-node/log4js-node/pull/1138) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump flatted from 2.0.2 to 3.2.4](https://github.com/log4js-node/log4js-node/pull/1137) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps-dev): bump fs-extra from 8.1.0 to 10.0.0](https://github.com/log4js-node/log4js-node/pull/1136) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump streamroller from 2.2.4 to 3.0.1](https://github.com/log4js-node/log4js-node/pull/1135) - thanks [@lamweili](https://github.com/lamweili) - [fix: compressed file ignores dateFile appender "mode"](https://github.com/log4js-node/streamroller/pull/65) - thanks [@rnd-debug](https://github.com/rnd-debug) - fix: [#1039](https://github.com/log4js-node/log4js-node/issues/1039) where there is an additional separator in filename ([streamroller@3.0.0 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - fix: [#1035](https://github.com/log4js-node/log4js-node/issues/1035), [#1080](https://github.com/log4js-node/log4js-node/issues/1080) for daysToKeep naming confusion ([streamroller@3.0.0 changelog](https://github.com/log4js-node/streamroller/blob/master/CHANGELOG.md)) - [refactor: migrated from daysToKeep to numBackups due to streamroller@^3.0.0](https://github.com/log4js-node/log4js-node/pull/1149) - thanks [@lamweili](https://github.com/lamweili) - [feat: allows for zero backups](https://github.com/log4js-node/log4js-node/pull/1151) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): bump date-format from 3.0.0 to 4.0.2](https://github.com/log4js-node/log4js-node/pull/1134) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1130) - thanks [@lamweili](https://github.com/lamweili) - chore(deps-dev): bump eslint-config-prettier from 6.15.0 to 8.3.0 - chore(deps-dev): bump eslint-plugin-prettier from 3.4.1 to 4.0.0 - chore(deps-dev): bump husky from 3.1.0 to 7.0.4 - chore(deps-dev): bump prettier from 1.19.0 to 2.5.1 - chore(deps-dev): bump typescript from 3.9.10 to 4.5.4 - [chore(deps-dev): bump eslint-config-prettier from 6.15.0 to 8.3.0](https://github.com/log4js-node/log4js-node/pull/1129) - thanks [@lamweili](https://github.com/lamweili) - [chore(deps): updated dependencies](https://github.com/log4js-node/log4js-node/pull/1121) - thanks 
[@lamweili](https://github.com/lamweili) - chore(deps-dev): bump codecov from 3.6.1 to 3.8.3 - chore(deps-dev): bump eslint-config-prettier from 6.5.0 to 6.15.0 - chore(deps-dev): bump eslint-import-resolver-node from 0.3.2 to 0.3.6 - chore(deps-dev): bump eslint-plugin-import" from 2.18.2 to 2.25.4 - chore(deps-dev): bump eslint-plugin-prettier from 3.1.1 to 3.4.1 - chore(deps-dev): bump husky from 3.0.9 to 3.1.0 - chore(deps-dev): bump prettier from 1.18.2 to 1.19.1 - chore(deps-dev): bump typescript from 3.7.2 to 3.9.10 - [chore(deps): bump path-parse from 1.0.6 to 1.0.7](https://github.com/log4js-node/log4js-node/pull/1120) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump glob-parent from 5.1.1 to 5.1.2](https://github.com/log4js-node/log4js-node/pull/1084) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump hosted-git-info from 2.7.1 to 2.8.9](https://github.com/log4js-node/log4js-node/pull/1076) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump lodash from 4.17.14 to 4.17.21](https://github.com/log4js-node/log4js-node/pull/1075) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump y18n from 4.0.0 to 4.0.1](https://github.com/log4js-node/log4js-node/pull/1070) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump node-fetch from 2.6.0 to 2.6.1](https://github.com/log4js-node/log4js-node/pull/1047) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps): bump yargs-parser from 13.1.1 to 13.1.2](https://github.com/log4js-node/log4js-node/pull/1045) - thanks [@Dependabot](https://github.com/dependabot) - [chore(deps-dev): bump codecov from 3.6.5 to 3.7.1](https://github.com/log4js-node/log4js-node/pull/1033) - thanks [@Dependabot](https://github.com/dependabot) ## 6.3.0 - [Add option to file appender to remove ANSI colours](https://github.com/log4js-node/log4js-node/pull/1001) - thanks [@BlueCocoa](https://github.com/BlueCocoa) - [Do not create appender if no categories use it](https://github.com/log4js-node/log4js-node/pull/1002) - thanks [@rnd-debug](https://github.com/rnd-debug) - [Docs: better categories inheritance description](https://github.com/log4js-node/log4js-node/pull/1003) - thanks [@rnd-debug](https://github.com/rnd-debug) - [Better jsdoc docs](https://github.com/log4js-node/log4js-node/pull/1004) - thanks [@wataash](https://github.com/wataash) - [Typescript: access category field in Logger](https://github.com/log4js-node/log4js-node/pull/1006) - thanks [@rtvd](https://github.com/rtvd) - [Docs: influxdb appender](https://github.com/log4js-node/log4js-node/pull/1014) - thanks [@rnd-debug](https://github.com/rnd-debug) - [Support for fileSync appender in webpack](https://github.com/log4js-node/log4js-node/pull/1015) - thanks [@lauren-li](https://github.com/lauren-li) - [Docs: UDP appender](https://github.com/log4js-node/log4js-node/pull/1018) - thanks [@iassasin](https://github.com/iassasin) - [Style: spaces and tabs](https://github.com/log4js-node/log4js-node/pull/1016) - thanks [@abetomo](https://github.com/abetomo) ## 6.2.1 - [Update streamroller to 2.2.4 to fix incorrect filename matching during log rotation](https://github.com/log4js-node/log4js-node/pull/996) ## 6.2.0 - [Add custom message end token to TCP appender](https://github.com/log4js-node/log4js-node/pull/994) - thanks [@rnd-debug](https://github.com/rnd-debug) - [Update acorn (dev dep of a dep)](https://github.com/log4js-node/log4js-node/pull/992) - thanks Github Robots. 
## 6.1.2 - [Handle out-of-order appender loading](https://github.com/log4js-node/log4js-node/pull/986) - thanks [@mvastola](https://github.com/mvastola) ## 6.1.1 - [Add guards for undefined shutdown callback](https://github.com/log4js-node/log4js-node/pull/972) - thanks [@aaron-edwards](https://github.com/aaron-edwards) - [Ignore .bob files](https://github.com/log4js-node/log4js-node/pull/975) - thanks [@cesine](https://github.com/cesine) - [Add mark method to type definitions](https://github.com/log4js-node/log4js-node/pull/984) - thanks [@techmunk](https://github.com/techmunk) ## 6.1.0 - [Add pause event to dateFile appender](https://github.com/log4js-node/log4js-node/pull/965) - thanks [@shayantabatabaee](https://github.com/shayantabatabaee) - [Add pause event to file appender](https://github.com/log4js-node/log4js-node/pull/938) - thanks [@shayantabatabaee](https://github.com/shayantabatabaee) - [Add pause/resume event to docs](https://github.com/log4js-node/log4js-node/pull/966) ## 6.0.0 - [Update streamroller to fix unhandled promise rejection](https://github.com/log4js-node/log4js-node/pull/962) - [Updated date-format library](https://github.com/log4js-node/log4js-node/pull/960) ## 5.3.0 - [Padding and truncation changes](https://github.com/log4js-node/log4js-node/pull/956) ## 5.2.2 - [Update streamroller to fix overwriting old files when using date rolling](https://github.com/log4js-node/log4js-node/pull/951) ## 5.2.1 - [Update streamroller to fix numToKeep not working with dateFile pattern that is all digits](https://github.com/log4js-node/log4js-node/pull/949) ## 5.2.0 - [Update streamroller to 2.2.0 (copy and truncate when file is busy)](https://github.com/log4js-node/log4js-node/pull/948) ## 5.1.0 - [Update streamroller to 2.1.0 (windows fixes)](https://github.com/log4js-node/log4js-node/pull/933) ## 5.0.0 - [Update streamroller to 2.0.0 (remove support for node v6)](https://github.com/log4js-node/log4js-node/pull/922) - [Update dependencies (mostly dev deps)](https://github.com/log4js-node/log4js-node/pull/923) - [Fix error when cluster not available](https://github.com/log4js-node/log4js-node/pull/930) - [Test coverage improvements](https://github.com/log4js-node/log4js-node/pull/925) ## 4.5.1 - [Update streamroller 1.0.5 -> 1.0.6 (to fix overwriting old backup log files)](https://github.com/log4js-node/log4js-node/pull/918) - [Dependency update: lodash 4.17.4 (dependency of a dependency, not log4js)](https://github.com/log4js-node/log4js-node/pull/917) - thanks Github Automated Security Thing. - [Dependency update: lodash 4.4.0 -> 4.5.0 (dependency of a dependency, not log4js)](https://github.com/log4js-node/log4js-node/pull/915) - thanks Github Automated Security Thing. 
## 4.5.0 - [Override call stack parsing](https://github.com/log4js-node/log4js-node/pull/914) - thanks [@rommni](https://github.com/rommni) - [patternLayout filename depth token](https://github.com/log4js-node/log4js-node/pull/913) - thanks [@rommni](https://github.com/rommni) ## 4.4.0 - [Add option to pass appender module in config](https://github.com/log4js-node/log4js-node/pull/833) - thanks [@kaxelson](https://github.com/kaxelson) - [Added docs for passing appender module](https://github.com/log4js-node/log4js-node/pull/904) - [Updated dependencies](https://github.com/log4js-node/log4js-node/pull/900) ## 4.3.2 - [Types for enableCallStack](https://github.com/log4js-node/log4js-node/pull/897) - thanks [@citrusjunoss](https://github.com/citrusjunoss) ## 4.3.1 - [Fix for maxLogSize in dateFile appender](https://github.com/log4js-node/log4js-node/pull/889) ## 4.3.0 - [Feature: line number support](https://github.com/log4js-node/log4js-node/pull/879) - thanks [@victor0801x](https://github.com/victor0801x) - [Fix for missing core appenders in webpack](https://github.com/log4js-node/log4js-node/pull/882) ## 4.2.0 - [Feature: add appender and level inheritance](https://github.com/log4js-node/log4js-node/pull/863) - thanks [@pharapiak](https://github.com/pharapiak) - [Feature: add response to context for connectLogger](https://github.com/log4js-node/log4js-node/pull/862) - thanks [@leak4mk0](https://github.com/leak4mk0) - [Fix for broken sighup handler](https://github.com/log4js-node/log4js-node/pull/873) - [Add missing types for Level](https://github.com/log4js-node/log4js-node/pull/872) - thanks [@Ivkaa](https://github.com/Ivkaa) - [Typescript fixes for connect logger context](https://github.com/log4js-node/log4js-node/pull/876) - thanks [@leak4mk0](https://github.com/leak4mk0) - [Upgrade to streamroller-1.0.5 to fix log rotation bug](https://github.com/log4js-node/log4js-node/pull/878) ## 4.1.1 - [Various test fixes for node v12](https://github.com/log4js-node/log4js-node/pull/870) - [Fix layout problem in node v12](https://github.com/log4js-node/log4js-node/pull/860) - thanks [@bjornstar](https://github.com/bjornstar) - [Add missing types for addLevels](https://github.com/log4js-node/log4js-node/pull/867) - thanks [@Ivkaa](https://github.com/Ivkaa) - [Allow any return type for layout function](https://github.com/log4js-node/log4js-node/pull/845) - thanks [@xinbenlv](https://github.com/xinbenlv) ## 4.1.0 - Updated streamroller to 1.0.4, to fix a bug where the inital size of an existing file was ignored when appending - [Updated streamroller to 1.0.3](https://github.com/log4js-node/log4js-node/pull/841), to fix a crash bug if the date pattern was all digits. - [Updated dependencies](https://github.com/log4js-node/log4js-node/pull/840) ## Previous versions Change information for older versions can be found by looking at the milestones in github.
# Pull request #1333: refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one

Repository: log4js-node/log4js-node. Author: [@lamweili](https://github.com/lamweili). Opened 2022-10-01T17:05:08Z, merged 2022-10-01T17:20:06Z. Commits: 570ef530dc02d3e843a5421cb015bb8fadfe0b41 (previous) to cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7 (merged). The file snapshots below show the repository contents associated with this pull request; a rough sketch of the refactoring pattern named in the title follows.
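The change described by the title replaces one-by-one copying of a LoggingEvent's location fields with a loop over a list of keys. The sketch below is only an illustration of that pattern, not the actual log4js source; the key list and object shapes are assumptions.

```javascript
// Hypothetical sketch of the "loop through location keys" pattern.
// The key names and object shapes are assumptions, not the real log4js code.
const LOCATION_KEYS = [
  'fileName',
  'lineNumber',
  'columnNumber',
  'callStack',
  'functionName',
];

// Before: event.fileName = location.fileName; event.lineNumber = ... ,
// one hard-coded assignment per field.
// After: a single loop that copies whichever keys are present.
function applyLocation(event, location) {
  LOCATION_KEYS.forEach((key) => {
    if (location && location[key] !== undefined) {
      event[key] = location[key];
    }
  });
  return event;
}

// Usage:
const event = { level: 'INFO', data: ['hello'] };
applyLocation(event, { fileName: 'app.js', lineNumber: 42 });
console.log(event.fileName, event.lineNumber); // app.js 42
```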
## ./.prettierignore
```
**/.*
coverage
```
## ./examples/cluster.js
```javascript
'use strict';

const cluster = require('cluster');
const log4js = require('../lib/log4js');

log4js.configure({
  appenders: {
    out: { type: 'stdout' },
  },
  categories: { default: { appenders: ['out'], level: 'debug' } },
});

let logger;
if (cluster.isMaster) {
  logger = log4js.getLogger('master');
  cluster.fork();
  logger.info('master is done', process.pid, new Error('flaps'));
} else {
  logger = log4js.getLogger('worker');
  logger.info("I'm a worker, with pid ", process.pid, new Error('pants'));
  logger.info("I'm a worker, with pid ", process.pid, new Error());
  logger.info('cluster.worker ', cluster.worker);
  cluster.worker.disconnect();
}
```
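If per-process logging is preferred over forwarding worker events to the master, the configuration can opt out of cluster handling. A minimal sketch, assuming the documented `disableClustering` flag and continuing the example above:

```javascript
// Each process (master and workers) talks to its own appenders directly,
// instead of workers forwarding log events to the master process.
log4js.configure({
  appenders: { out: { type: 'stdout' } },
  categories: { default: { appenders: ['out'], level: 'debug' } },
  disableClustering: true,
});
```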
## ./docs/tcp.md
````markdown
# TCP Appender

The TCP appender sends log events to a master server over TCP sockets. It can be used as a simple way to centralise logging when you have multiple servers or processes. It uses the node.js core networking modules, and so does not require any extra dependencies. Remember to call `log4js.shutdown` when your application terminates, so that the sockets get closed cleanly. It's designed to work with the [tcp-server](tcp-server.md), but it doesn't necessarily have to, just make sure whatever is listening at the other end is expecting JSON objects as strings.

## Configuration

- `type` - `tcp`
- `port` - `integer` (optional, defaults to `5000`) - the port to send to
- `host` - `string` (optional, defaults to `localhost`) - the host/IP address to send to
- `endMsg` - `string` (optional, defaults to `__LOG4JS__`) - the delimiter that marks the end of a log message
- `layout` - `object` (optional, defaults to a serialized log event) - see [layouts](layouts.md)

## Example

```javascript
log4js.configure({
  appenders: {
    network: { type: "tcp", host: "log.server" },
  },
  categories: {
    default: { appenders: ["network"], level: "error" },
  },
});
```

This will send all error messages to `log.server:5000`.
````
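As the notes above say, the TCP appender holds a socket open, so `log4js.shutdown` should run before the process exits. A minimal sketch of wiring that up; the host name and the choice of signal are just examples:

```javascript
const log4js = require("log4js");

log4js.configure({
  appenders: { network: { type: "tcp", host: "log.server" } },
  categories: { default: { appenders: ["network"], level: "error" } },
});

const logger = log4js.getLogger();
logger.error("Something went wrong");

// Flush pending events and close the TCP socket cleanly on shutdown.
process.on("SIGTERM", () => {
  log4js.shutdown(() => process.exit(0));
});
```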
## ./test/tap/pause-test.js
```javascript
const tap = require('tap');
const fs = require('fs');
const log4js = require('../../lib/log4js');

const removeFiles = async (filenames) => {
  if (!Array.isArray(filenames)) filenames = [filenames];
  const promises = filenames.map((filename) => fs.promises.unlink(filename));
  await Promise.allSettled(promises);
};

tap.test('Drain event test', (batch) => {
  batch.test(
    'Should emit pause event and resume when logging in a file with high frequency',
    (t) => {
      t.teardown(async () => {
        process.off(
          'log4js:pause',
          process.listeners('log4js:pause')[
            process.listeners('log4js:pause').length - 1
          ]
        );
        await removeFiles('logs/drain.log');
      });
      // Generate logger with 5k of highWaterMark config
      log4js.configure({
        appenders: {
          file: {
            type: 'file',
            filename: 'logs/drain.log',
            highWaterMark: 5 * 1024,
          },
        },
        categories: {
          default: { appenders: ['file'], level: 'debug' },
        },
      });

      let paused = false;
      let resumed = false;

      process.on('log4js:pause', (value) => {
        if (value) {
          paused = true;
          t.ok(value, 'log4js:pause, true');
        } else {
          resumed = true;
          t.ok(!value, 'log4js:pause, false');
          t.end();
        }
      });

      const logger = log4js.getLogger();
      while (!paused && !resumed) {
        if (!paused) {
          logger.info('This is a test for emitting drain event');
        }
      }
    }
  );

  batch.test(
    'Should emit pause event and resume when logging in a date file with high frequency',
    (t) => {
      t.teardown(async () => {
        process.off(
          'log4js:pause',
          process.listeners('log4js:pause')[
            process.listeners('log4js:pause').length - 1
          ]
        );
        await removeFiles('logs/date-file-drain.log');
      });
      // Generate date file logger with 5kb of highWaterMark config
      log4js.configure({
        appenders: {
          file: {
            type: 'dateFile',
            filename: 'logs/date-file-drain.log',
            highWaterMark: 5 * 1024,
          },
        },
        categories: {
          default: { appenders: ['file'], level: 'debug' },
        },
      });

      let paused = false;
      let resumed = false;

      process.on('log4js:pause', (value) => {
        if (value) {
          paused = true;
          t.ok(value, 'log4js:pause, true');
        } else {
          resumed = true;
          t.ok(!value, 'log4js:pause, false');
          t.end();
        }
      });

      const logger = log4js.getLogger();
      while (!paused && !resumed) {
        if (!paused)
          logger.info(
            'This is a test for emitting drain event in date file logger'
          );
      }
    }
  );

  batch.teardown(async () => {
    try {
      const files = fs.readdirSync('logs');
      await removeFiles(files.map((filename) => `logs/${filename}`));
      fs.rmdirSync('logs');
    } catch (e) {
      // doesn't matter
    }
  });

  batch.end();
});
```
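The test above exercises the `log4js:pause` event that the file-based appenders emit on `process`: `true` when the write stream backs up, `false` when it drains again. A minimal sketch of an application reacting to it; the throttling policy here is only an illustration:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: { app: { type: 'file', filename: 'logs/app.log' } },
  categories: { default: { appenders: ['app'], level: 'info' } },
});

let logsPaused = false;

// true means the appender is backed up; false means it has drained.
process.on('log4js:pause', (paused) => {
  logsPaused = paused;
});

const logger = log4js.getLogger();
function logIfPossible(message) {
  // Skip low-value messages while the appender is catching up.
  if (!logsPaused) logger.info(message);
}

logIfPossible('hello');
```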
## ./examples/example-connect-logger.js
```javascript
// The connect/express logger was added to log4js by danbell. This allows connect/express servers to log using log4js.
// https://github.com/nomiddlename/log4js-node/wiki/Connect-Logger

// load modules
const log4js = require('log4js');
const express = require('express');

const app = express();

// config
log4js.configure({
  appenders: {
    console: { type: 'console' },
    file: { type: 'file', filename: 'logs/log4jsconnect.log' },
  },
  categories: {
    default: { appenders: ['console'], level: 'debug' },
    log4jslog: { appenders: ['file'], level: 'debug' },
  },
});

// define logger
const logger = log4js.getLogger('log4jslog');

// set at which time msg is logged print like: only on error & above
// logger.setLevel('ERROR');

// express app
app.use(express.favicon(''));
// app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO }));
// app.use(log4js.connectLogger(logger, { level: 'auto', format: ':method :url :status' }));

// ### AUTO LEVEL DETECTION
// http responses 3xx, level = WARN
// http responses 4xx & 5xx, level = ERROR
// else, level = INFO
app.use(log4js.connectLogger(logger, { level: 'auto' }));

// route
app.get('/', (req, res) => {
  res.send('hello world');
});

// start app
app.listen(5000);

console.log('server running at localhost:5000');
console.log('Simulation of normal response: goto localhost:5000');
console.log('Simulation of error response: goto localhost:5000/xxx');
```
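Continuing the example above, `connectLogger` also takes a `nolog` option to suppress access-log entries for matching requests. A small sketch; the URL pattern is only an example:

```javascript
// Skip access logging for static assets and a health-check route.
app.use(
  log4js.connectLogger(logger, {
    level: 'auto',
    nolog: '\\.(gif|jpe?g|png)$|^/healthz$',
  })
);
```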
## ./types/test.ts
```typescript
import * as log4js from './log4js';

log4js.configure('./filename');
const logger1 = log4js.getLogger();
logger1.level = 'debug';
logger1.debug('Some debug messages');
logger1.fatal({
  whatever: 'foo',
});

const logger3 = log4js.getLogger('cheese');
logger3.trace('Entering cheese testing');
logger3.debug('Got cheese.');
logger3.info('Cheese is Gouda.');
logger3.warn('Cheese is quite smelly.');
logger3.error('Cheese is too ripe!');
logger3.fatal('Cheese was breeding ground for listeria.');

log4js.configure({
  appenders: { cheese: { type: 'console', filename: 'cheese.log' } },
  categories: { default: { appenders: ['cheese'], level: 'error' } },
});

log4js.configure({
  appenders: {
    out: { type: 'file', filename: 'pm2logs.log' },
  },
  categories: {
    default: { appenders: ['out'], level: 'info' },
  },
  pm2: true,
  pm2InstanceVar: 'INSTANCE_ID',
});

log4js.addLayout(
  'json',
  (config) =>
    function (logEvent) {
      return JSON.stringify(logEvent) + config.separator;
    }
);

log4js.configure({
  appenders: {
    out: { type: 'stdout', layout: { type: 'json', separator: ',' } },
  },
  categories: {
    default: { appenders: ['out'], level: 'info' },
  },
});

log4js.configure({
  appenders: {
    file: { type: 'dateFile', filename: 'thing.log', pattern: '.mm' },
  },
  categories: {
    default: { appenders: ['file'], level: 'debug' },
  },
});

const logger4 = log4js.getLogger('thing');
logger4.log('logging a thing');

const logger5 = log4js.getLogger('json-test');
logger5.info('this is just a test');
logger5.error('of a custom appender');
logger5.warn('that outputs json');
log4js.shutdown();

log4js.configure({
  appenders: {
    cheeseLogs: { type: 'file', filename: 'cheese.log' },
    console: { type: 'console' },
  },
  categories: {
    cheese: { appenders: ['cheeseLogs'], level: 'error' },
    another: { appenders: ['console'], level: 'trace' },
    default: { appenders: ['console', 'cheeseLogs'], level: 'trace' },
  },
});

const logger6 = log4js.getLogger('cheese');
// only errors and above get logged.
const otherLogger = log4js.getLogger();

// this will get coloured output on console, and appear in cheese.log
otherLogger.error('AAArgh! Something went wrong', {
  some: 'otherObject',
  useful_for: 'debug purposes',
});
otherLogger.log('This should appear as info output');

// these will not appear (logging level beneath error)
logger6.trace('Entering cheese testing');
logger6.debug('Got cheese.');
logger6.info('Cheese is Gouda.');
logger6.log('Something funny about cheese.');
logger6.warn('Cheese is quite smelly.');

// these end up only in cheese.log
logger6.error('Cheese %s is too ripe!', 'gouda');
logger6.fatal('Cheese was breeding ground for listeria.');

// these don't end up in cheese.log, but will appear on the console
const anotherLogger = log4js.getLogger('another');
anotherLogger.debug('Just checking');

// will also go to console and cheese.log, since that's configured for all categories
const pantsLog = log4js.getLogger('pants');
pantsLog.debug('Something for pants');

import { configure, getLogger } from './log4js';
configure('./filename');
const logger2 = getLogger();
logger2.level = 'debug';
logger2.debug('Some debug messages');
configure({
  appenders: { cheese: { type: 'file', filename: 'cheese.log' } },
  categories: { default: { appenders: ['cheese'], level: 'error' } },
});

log4js.configure('./filename').getLogger();
const logger7 = log4js.getLogger();
logger7.level = 'debug';
logger7.debug('Some debug messages');

const levels: log4js.Levels = log4js.levels;
const level: log4js.Level = levels.getLevel('info');

log4js.connectLogger(logger1, {
  format: ':x, :y',
  level: 'info',
  context: true,
});

log4js.connectLogger(logger2, {
  format: (req, _res, format) =>
    format(
      `:remote-addr - ${req.id} - ":method :url HTTP/:http-version" :status :content-length ":referrer" ":user-agent"`
    ),
});

//support for passing in an appender module
log4js.configure({
  appenders: { thing: { type: { configure: () => () => {} } } },
  categories: { default: { appenders: ['thing'], level: 'debug' } },
});

declare module './log4js' {
  interface Appenders {
    StorageTestAppender: {
      type: 'storageTest';
      storageMedium: 'dvd' | 'usb' | 'hdd';
    };
  }
}

log4js.configure({
  appenders: { test: { type: 'storageTest', storageMedium: 'dvd' } },
  categories: { default: { appenders: ['test'], level: 'debug' } },
});

log4js.configure({
  appenders: { rec: { type: 'recording' } },
  categories: { default: { appenders: ['rec'], level: 'debug' } },
});
const logger8 = log4js.getLogger();
logger8.level = 'debug';
logger8.debug('This will go to the recording!');
logger8.debug('Another one');

const recording = log4js.recording();
const loggingEvents = recording.playback();
if (loggingEvents.length !== 2) {
  throw new Error(`Expected 2 recorded events, got ${loggingEvents.length}`);
}
if (loggingEvents[0].data[0] !== 'This will go to the recording!') {
  throw new Error(
    `Expected message 'This will go to the recording!', got ${loggingEvents[0].data[0]}`
  );
}
if (loggingEvents[1].data[0] !== 'Another one') {
  throw new Error(
    `Expected message 'Another one', got ${loggingEvents[1].data[0]}`
  );
}
recording.reset();
const loggingEventsPostReset = recording.playback();
if (loggingEventsPostReset.length !== 0) {
  throw new Error(
    `Expected 0 recorded events after reset, got ${loggingEventsPostReset.length}`
  );
}
```
import * as log4js from './log4js'; log4js.configure('./filename'); const logger1 = log4js.getLogger(); logger1.level = 'debug'; logger1.debug('Some debug messages'); logger1.fatal({ whatever: 'foo', }); const logger3 = log4js.getLogger('cheese'); logger3.trace('Entering cheese testing'); logger3.debug('Got cheese.'); logger3.info('Cheese is Gouda.'); logger3.warn('Cheese is quite smelly.'); logger3.error('Cheese is too ripe!'); logger3.fatal('Cheese was breeding ground for listeria.'); log4js.configure({ appenders: { cheese: { type: 'console', filename: 'cheese.log' } }, categories: { default: { appenders: ['cheese'], level: 'error' } }, }); log4js.configure({ appenders: { out: { type: 'file', filename: 'pm2logs.log' }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, pm2: true, pm2InstanceVar: 'INSTANCE_ID', }); log4js.addLayout( 'json', (config) => function (logEvent) { return JSON.stringify(logEvent) + config.separator; } ); log4js.configure({ appenders: { out: { type: 'stdout', layout: { type: 'json', separator: ',' } }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, }); log4js.configure({ appenders: { file: { type: 'dateFile', filename: 'thing.log', pattern: '.mm' }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); const logger4 = log4js.getLogger('thing'); logger4.log('logging a thing'); const logger5 = log4js.getLogger('json-test'); logger5.info('this is just a test'); logger5.error('of a custom appender'); logger5.warn('that outputs json'); log4js.shutdown(); log4js.configure({ appenders: { cheeseLogs: { type: 'file', filename: 'cheese.log' }, console: { type: 'console' }, }, categories: { cheese: { appenders: ['cheeseLogs'], level: 'error' }, another: { appenders: ['console'], level: 'trace' }, default: { appenders: ['console', 'cheeseLogs'], level: 'trace' }, }, }); const logger6 = log4js.getLogger('cheese'); // only errors and above get logged. const otherLogger = log4js.getLogger(); // this will get coloured output on console, and appear in cheese.log otherLogger.error('AAArgh! 
Something went wrong', { some: 'otherObject', useful_for: 'debug purposes', }); otherLogger.log('This should appear as info output'); // these will not appear (logging level beneath error) logger6.trace('Entering cheese testing'); logger6.debug('Got cheese.'); logger6.info('Cheese is Gouda.'); logger6.log('Something funny about cheese.'); logger6.warn('Cheese is quite smelly.'); // these end up only in cheese.log logger6.error('Cheese %s is too ripe!', 'gouda'); logger6.fatal('Cheese was breeding ground for listeria.'); // these don't end up in cheese.log, but will appear on the console const anotherLogger = log4js.getLogger('another'); anotherLogger.debug('Just checking'); // will also go to console and cheese.log, since that's configured for all categories const pantsLog = log4js.getLogger('pants'); pantsLog.debug('Something for pants'); import { configure, getLogger } from './log4js'; configure('./filename'); const logger2 = getLogger(); logger2.level = 'debug'; logger2.debug('Some debug messages'); configure({ appenders: { cheese: { type: 'file', filename: 'cheese.log' } }, categories: { default: { appenders: ['cheese'], level: 'error' } }, }); log4js.configure('./filename').getLogger(); const logger7 = log4js.getLogger(); logger7.level = 'debug'; logger7.debug('Some debug messages'); const levels: log4js.Levels = log4js.levels; const level: log4js.Level = levels.getLevel('info'); log4js.connectLogger(logger1, { format: ':x, :y', level: 'info', context: true, }); log4js.connectLogger(logger2, { format: (req, _res, format) => format( `:remote-addr - ${req.id} - ":method :url HTTP/:http-version" :status :content-length ":referrer" ":user-agent"` ), }); //support for passing in an appender module log4js.configure({ appenders: { thing: { type: { configure: () => () => {} } } }, categories: { default: { appenders: ['thing'], level: 'debug' } }, }); declare module './log4js' { interface Appenders { StorageTestAppender: { type: 'storageTest'; storageMedium: 'dvd' | 'usb' | 'hdd'; }; } } log4js.configure({ appenders: { test: { type: 'storageTest', storageMedium: 'dvd' } }, categories: { default: { appenders: ['test'], level: 'debug' } }, }); log4js.configure({ appenders: { rec: { type: 'recording' } }, categories: { default: { appenders: ['rec'], level: 'debug' } }, }); const logger8 = log4js.getLogger(); logger8.level = 'debug'; logger8.debug('This will go to the recording!'); logger8.debug('Another one'); const recording = log4js.recording(); const loggingEvents = recording.playback(); if (loggingEvents.length !== 2) { throw new Error(`Expected 2 recorded events, got ${loggingEvents.length}`); } if (loggingEvents[0].data[0] !== 'This will go to the recording!') { throw new Error( `Expected message 'This will go to the recording!', got ${loggingEvents[0].data[0]}` ); } if (loggingEvents[1].data[0] !== 'Another one') { throw new Error( `Expected message 'Another one', got ${loggingEvents[1].data[0]}` ); } recording.reset(); const loggingEventsPostReset = recording.playback(); if (loggingEventsPostReset.length !== 0) { throw new Error( `Expected 0 recorded events after reset, got ${loggingEventsPostReset.length}` ); }
-1
log4js-node/log4js-node
1,333
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one
lamweili
"2022-10-01T17:05:08Z"
"2022-10-01T17:20:06Z"
570ef530dc02d3e843a5421cb015bb8fadfe0b41
cfbc7a08a6395a9c9bd6ceb9573a9ca786e137d7
refactor(LoggingEvent): loop through location keys instead of hard-coding one-by-one.
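The PR title above only states the intent of the refactor; the actual diff is not included here. As a rough, hypothetical sketch (the names `locationKeys` and `copyLocation` are invented for illustration), looping over a shared key list could replace the one-by-one assignments that appear later in `lib/LoggingEvent.js`:

```javascript
// Hypothetical sketch of "loop through location keys instead of
// hard-coding one-by-one"; the key list mirrors the fields the
// LoggingEvent constructor copies individually further down.
const locationKeys = [
  'fileName',
  'lineNumber',
  'columnNumber',
  'callStack',
  'className',
  'functionName',
  'functionAlias',
  'callerName',
];

// copy whichever location fields are present onto the event
function copyLocation(location, event) {
  if (!location) return;
  locationKeys.forEach((key) => {
    event[key] = location[key];
  });
}
```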
./examples/hipchat-appender.js
/** * !!! The hipchat-appender requires `hipchat-notifier` from npm, e.g. * - list as a dependency in your application's package.json || * - npm install hipchat-notifier */ const log4js = require('../lib/log4js'); log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' }, }, }); const logger = log4js.getLogger('hipchat'); logger.warn('Test Warn message'); logger.info('Test Info message'); logger.debug('Test Debug Message'); logger.trace('Test Trace Message'); logger.fatal('Test Fatal Message'); logger.error('Test Error Message'); // alternative configuration demonstrating callback + custom layout // ///////////////////////////////////////////////////////////////// // use a custom layout function (in this case, the provided basicLayout) // format: [TIMESTAMP][LEVEL][category] - [message] log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', hipchat_from: 'Mr. Semantics', hipchat_notify: false, hipchat_response_callback: function (err, response, body) { if (err || response.statusCode > 300) { throw new Error('hipchat-notifier failed'); } console.log('mr semantics callback success'); }, layout: { type: 'basic' }, }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' } }, }); logger.info('Test customLayout from Mr. Semantics');
/** * !!! The hipchat-appender requires `hipchat-notifier` from npm, e.g. * - list as a dependency in your application's package.json || * - npm install hipchat-notifier */ const log4js = require('../lib/log4js'); log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' }, }, }); const logger = log4js.getLogger('hipchat'); logger.warn('Test Warn message'); logger.info('Test Info message'); logger.debug('Test Debug Message'); logger.trace('Test Trace Message'); logger.fatal('Test Fatal Message'); logger.error('Test Error Message'); // alternative configuration demonstrating callback + custom layout // ///////////////////////////////////////////////////////////////// // use a custom layout function (in this case, the provided basicLayout) // format: [TIMESTAMP][LEVEL][category] - [message] log4js.configure({ appenders: { hipchat: { type: 'hipchat', hipchat_token: process.env.HIPCHAT_TOKEN || '< User token with Notification Privileges >', hipchat_room: process.env.HIPCHAT_ROOM || '< Room ID or Name >', hipchat_from: 'Mr. Semantics', hipchat_notify: false, hipchat_response_callback: function (err, response, body) { if (err || response.statusCode > 300) { throw new Error('hipchat-notifier failed'); } console.log('mr semantics callback success'); }, layout: { type: 'basic' }, }, }, categories: { default: { appenders: ['hipchat'], level: 'trace' } }, }); logger.info('Test customLayout from Mr. Semantics');
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`.

Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
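The table above describes a sentinel-string scheme: values JSON cannot represent are swapped for marker strings on the wire and swapped back on receipt. Below is a minimal, self-contained sketch of that round trip; the real implementation is `LoggingEvent.serialise()`/`deserialise()` in `lib/LoggingEvent.js` (shown next) and uses `flatted` plus a `SerDe` lookup table rather than plain JSON, so treat this only as an illustration.

```javascript
// Sentinel mapping sketch: NaN, Infinity, -Infinity and undefined survive
// a stringify/parse round trip by travelling as marker strings.
const SENTINELS = {
  __LOG4JS_undefined__: undefined,
  __LOG4JS_NaN__: NaN,
  __LOG4JS_Infinity__: Infinity,
  '__LOG4JS_-Infinity__': -Infinity,
};

// serialise: replace the unrepresentable values with their sentinel strings
function toWire(value) {
  return JSON.stringify(value, (key, v) => {
    if (typeof v === 'undefined') return '__LOG4JS_undefined__';
    if (typeof v === 'number' && Number.isNaN(v)) return '__LOG4JS_NaN__';
    if (v === Infinity) return '__LOG4JS_Infinity__';
    if (v === -Infinity) return '__LOG4JS_-Infinity__';
    return v;
  });
}

// deserialise: map the sentinel strings back to the original values
function fromWire(text) {
  return JSON.parse(text, (key, v) =>
    typeof v === 'string' && v in SENTINELS ? SENTINELS[v] : v
  );
}

const out = fromWire(toWire([Number('abc'), 1 / 0, -1 / 0, undefined]));
// out[0] is NaN, out[1] is Infinity, out[2] is -Infinity, out[3] is undefined
```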
./lib/LoggingEvent.js
const flatted = require('flatted'); const levels = require('./levels'); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (location) { this.fileName = location.fileName; this.lineNumber = location.lineNumber; this.columnNumber = location.columnNumber; this.callStack = location.callStack; this.className = location.className; this.functionName = location.functionName; this.functionAlias = location.functionAlias; this.callerName = location.callerName; } } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: parseInt('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. else if ( typeof value === 'number' && (Number.isNaN(value) || !Number.isFinite(value)) ) { value = value.toString(); } // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. else if (typeof value === 'undefined') { value = typeof value; } return value; }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return value; }); if ( rehydratedEvent.fileName || rehydratedEvent.lineNumber || rehydratedEvent.columnNumber || rehydratedEvent.callStack || rehydratedEvent.className || rehydratedEvent.functionName || rehydratedEvent.functionAlias || rehydratedEvent.callerName ) { rehydratedEvent.location = { fileName: rehydratedEvent.fileName, lineNumber: rehydratedEvent.lineNumber, columnNumber: rehydratedEvent.columnNumber, callStack: rehydratedEvent.callStack, className: rehydratedEvent.className, functionName: rehydratedEvent.functionName, functionAlias: rehydratedEvent.functionAlias, callerName: rehydratedEvent.callerName, }; } event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
/* eslint max-classes-per-file: ["error", 2] */ const flatted = require('flatted'); const levels = require('./levels'); class SerDe { constructor() { const deserialise = { __LOG4JS_undefined__: undefined, __LOG4JS_NaN__: Number('abc'), __LOG4JS_Infinity__: 1 / 0, '__LOG4JS_-Infinity__': -1 / 0, }; this.deMap = deserialise; this.serMap = {}; Object.keys(this.deMap).forEach((key) => { const value = this.deMap[key]; this.serMap[value] = key; }); } canSerialise(key) { if (typeof key === 'string') return false; return key in this.serMap; } serialise(key) { if (this.canSerialise(key)) return this.serMap[key]; return key; } canDeserialise(key) { return key in this.deMap; } deserialise(key) { if (this.canDeserialise(key)) return this.deMap[key]; return key; } } const serde = new SerDe(); /** * @name LoggingEvent * @namespace Log4js */ class LoggingEvent { /** * Models a logging event. * @constructor * @param {string} categoryName name of category * @param {Log4js.Level} level level of message * @param {Array} data objects to log * @param {Error} [error] * @author Seth Chisamore */ constructor(categoryName, level, data, context, location, error) { this.startTime = new Date(); this.categoryName = categoryName; this.data = data; this.level = level; this.context = Object.assign({}, context); // eslint-disable-line prefer-object-spread this.pid = process.pid; this.error = error; if (location) { this.fileName = location.fileName; this.lineNumber = location.lineNumber; this.columnNumber = location.columnNumber; this.callStack = location.callStack; this.className = location.className; this.functionName = location.functionName; this.functionAlias = location.functionAlias; this.callerName = location.callerName; } } serialise() { return flatted.stringify(this, (key, value) => { // JSON.stringify(new Error('test')) returns {}, which is not really useful for us. // The following allows us to serialize errors (semi) correctly. if (value instanceof Error) { // eslint-disable-next-line prefer-object-spread value = Object.assign( { message: value.message, stack: value.stack }, value ); } // JSON.stringify({a: Number('abc'), b: 1/0, c: -1/0}) returns {a: null, b: null, c: null}. // The following allows us to serialize to NaN, Infinity and -Infinity correctly. // JSON.stringify([undefined]) returns [null]. // The following allows us to serialize to undefined correctly. 
return serde.serialise(value); }); } static deserialise(serialised) { let event; try { const rehydratedEvent = flatted.parse(serialised, (key, value) => { if (value && value.message && value.stack) { const fakeError = new Error(value); Object.keys(value).forEach((k) => { fakeError[k] = value[k]; }); value = fakeError; } return serde.deserialise(value); }); if ( rehydratedEvent.fileName || rehydratedEvent.lineNumber || rehydratedEvent.columnNumber || rehydratedEvent.callStack || rehydratedEvent.className || rehydratedEvent.functionName || rehydratedEvent.functionAlias || rehydratedEvent.callerName ) { rehydratedEvent.location = { fileName: rehydratedEvent.fileName, lineNumber: rehydratedEvent.lineNumber, columnNumber: rehydratedEvent.columnNumber, callStack: rehydratedEvent.callStack, className: rehydratedEvent.className, functionName: rehydratedEvent.functionName, functionAlias: rehydratedEvent.functionAlias, callerName: rehydratedEvent.callerName, }; } event = new LoggingEvent( rehydratedEvent.categoryName, levels.getLevel(rehydratedEvent.level.levelStr), rehydratedEvent.data, rehydratedEvent.context, rehydratedEvent.location, rehydratedEvent.error ); event.startTime = new Date(rehydratedEvent.startTime); event.pid = rehydratedEvent.pid; if (rehydratedEvent.cluster) { event.cluster = rehydratedEvent.cluster; } } catch (e) { event = new LoggingEvent('log4js', levels.ERROR, [ 'Unable to parse log:', serialised, 'because: ', e, ]); } return event; } } module.exports = LoggingEvent;
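One detail worth noting in the `SerDe` class above: `canSerialise()` refuses to touch strings, so a value that is literally the string `'NaN'` passes through unchanged while the number `NaN` is swapped for its sentinel (the test file later in this dump exercises exactly this pair). A tiny standalone illustration of that guard (`serMap` and `serialiseValue` are example names, not the library's API):

```javascript
// Only the *number* NaN maps to the sentinel; the *string* 'NaN' is user
// data and must be left alone. Object keys are strings, so NaN coerces to
// the key 'NaN' when looked up with `in`.
const serMap = { NaN: '__LOG4JS_NaN__' };

function serialiseValue(value) {
  if (typeof value === 'string') return value; // the guard from canSerialise()
  return value in serMap ? serMap[value] : value;
}

console.log(serialiseValue(Number('abc'))); // '__LOG4JS_NaN__'
console.log(serialiseValue('NaN')); // 'NaN' (unchanged)
```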
1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`.

Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
./lib/connect-logger.js
/* eslint no-underscore-dangle: ["error", { "allow": ["__statusCode", "_remoteAddress", "__headers", "_logging"] }] */ const levels = require('./levels'); const DEFAULT_FORMAT = ':remote-addr - -' + ' ":method :url HTTP/:http-version"' + ' :status :content-length ":referrer"' + ' ":user-agent"'; /** * Return request url path, * adding this function prevents the Cyclomatic Complexity, * for the assemble_tokens function at low, to pass the tests. * * @param {IncomingMessage} req * @return {string} * @api private */ function getUrl(req) { return req.originalUrl || req.url; } /** * Adds custom {token, replacement} objects to defaults, * overwriting the defaults if any tokens clash * * @param {IncomingMessage} req * @param {ServerResponse} res * @param {Array} customTokens * [{ token: string-or-regexp, replacement: string-or-replace-function }] * @return {Array} */ function assembleTokens(req, res, customTokens) { const arrayUniqueTokens = (array) => { const a = array.concat(); for (let i = 0; i < a.length; ++i) { for (let j = i + 1; j < a.length; ++j) { // not === because token can be regexp object // eslint-disable-next-line eqeqeq if (a[i].token == a[j].token) { a.splice(j--, 1); // eslint-disable-line no-plusplus } } } return a; }; const defaultTokens = []; defaultTokens.push({ token: ':url', replacement: getUrl(req) }); defaultTokens.push({ token: ':protocol', replacement: req.protocol }); defaultTokens.push({ token: ':hostname', replacement: req.hostname }); defaultTokens.push({ token: ':method', replacement: req.method }); defaultTokens.push({ token: ':status', replacement: res.__statusCode || res.statusCode, }); defaultTokens.push({ token: ':response-time', replacement: res.responseTime, }); defaultTokens.push({ token: ':date', replacement: new Date().toUTCString() }); defaultTokens.push({ token: ':referrer', replacement: req.headers.referer || req.headers.referrer || '', }); defaultTokens.push({ token: ':http-version', replacement: `${req.httpVersionMajor}.${req.httpVersionMinor}`, }); defaultTokens.push({ token: ':remote-addr', replacement: req.headers['x-forwarded-for'] || req.ip || req._remoteAddress || (req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress))), }); defaultTokens.push({ token: ':user-agent', replacement: req.headers['user-agent'], }); defaultTokens.push({ token: ':content-length', replacement: res.getHeader('content-length') || (res.__headers && res.__headers['Content-Length']) || '-', }); defaultTokens.push({ token: /:req\[([^\]]+)]/g, replacement(_, field) { return req.headers[field.toLowerCase()]; }, }); defaultTokens.push({ token: /:res\[([^\]]+)]/g, replacement(_, field) { return ( res.getHeader(field.toLowerCase()) || (res.__headers && res.__headers[field]) ); }, }); return arrayUniqueTokens(customTokens.concat(defaultTokens)); } /** * Return formatted log line. * * @param {string} str * @param {Array} tokens * @return {string} * @api private */ function format(str, tokens) { for (let i = 0; i < tokens.length; i++) { str = str.replace(tokens[i].token, tokens[i].replacement); } return str; } /** * Return RegExp Object about nolog * * @param {(string|Array)} nolog * @return {RegExp} * @api private * * syntax * 1. 
String * 1.1 "\\.gif" * NOT LOGGING http://example.com/hoge.gif and http://example.com/hoge.gif?fuga * LOGGING http://example.com/hoge.agif * 1.2 in "\\.gif|\\.jpg$" * NOT LOGGING http://example.com/hoge.gif and * http://example.com/hoge.gif?fuga and http://example.com/hoge.jpg?fuga * LOGGING http://example.com/hoge.agif, * http://example.com/hoge.ajpg and http://example.com/hoge.jpg?hoge * 1.3 in "\\.(gif|jpe?g|png)$" * NOT LOGGING http://example.com/hoge.gif and http://example.com/hoge.jpeg * LOGGING http://example.com/hoge.gif?uid=2 and http://example.com/hoge.jpg?pid=3 * 2. RegExp * 2.1 in /\.(gif|jpe?g|png)$/ * SAME AS 1.3 * 3. Array * 3.1 ["\\.jpg$", "\\.png", "\\.gif"] * SAME AS "\\.jpg|\\.png|\\.gif" */ function createNoLogCondition(nolog) { let regexp = null; if (nolog instanceof RegExp) { regexp = nolog; } if (typeof nolog === 'string') { regexp = new RegExp(nolog); } if (Array.isArray(nolog)) { // convert to strings const regexpsAsStrings = nolog.map((reg) => reg.source ? reg.source : reg ); regexp = new RegExp(regexpsAsStrings.join('|')); } return regexp; } /** * Allows users to define rules around status codes to assign them to a specific * logging level. * There are two types of rules: * - RANGE: matches a code within a certain range * E.g. { 'from': 200, 'to': 299, 'level': 'info' } * - CONTAINS: matches a code to a set of expected codes * E.g. { 'codes': [200, 203], 'level': 'debug' } * Note*: Rules are respected only in order of prescendence. * * @param {Number} statusCode * @param {Level} currentLevel * @param {Object} ruleSet * @return {Level} * @api private */ function matchRules(statusCode, currentLevel, ruleSet) { let level = currentLevel; if (ruleSet) { const matchedRule = ruleSet.find((rule) => { let ruleMatched = false; if (rule.from && rule.to) { ruleMatched = statusCode >= rule.from && statusCode <= rule.to; } else { ruleMatched = rule.codes.indexOf(statusCode) !== -1; } return ruleMatched; }); if (matchedRule) { level = levels.getLevel(matchedRule.level, level); } } return level; } /** * Log requests with the given `options` or a `format` string. * * Options: * * - `format` Format string, see below for tokens * - `level` A log4js levels instance. 
Supports also 'auto' * - `nolog` A string or RegExp to exclude target logs or function(req, res): boolean * - `statusRules` A array of rules for setting specific logging levels base on status codes * - `context` Whether to add a response of express to the context * * Tokens: * * - `:req[header]` ex: `:req[Accept]` * - `:res[header]` ex: `:res[Content-Length]` * - `:http-version` * - `:response-time` * - `:remote-addr` * - `:date` * - `:method` * - `:url` * - `:referrer` * - `:user-agent` * - `:status` * * @return {Function} * @param logger4js * @param options * @api public */ module.exports = function getLogger(logger4js, options) { if (typeof options === 'string' || typeof options === 'function') { options = { format: options }; } else { options = options || {}; } const thisLogger = logger4js; let level = levels.getLevel(options.level, levels.INFO); const fmt = options.format || DEFAULT_FORMAT; return (req, res, next) => { // mount safety if (req._logging !== undefined) return next(); // nologs if (typeof options.nolog !== 'function') { const nolog = createNoLogCondition(options.nolog); if (nolog && nolog.test(req.originalUrl)) return next(); } if (thisLogger.isLevelEnabled(level) || options.level === 'auto') { const start = new Date(); const { writeHead } = res; // flag as logging req._logging = true; // proxy for statusCode. res.writeHead = (code, headers) => { res.writeHead = writeHead; res.writeHead(code, headers); res.__statusCode = code; res.__headers = headers || {}; }; // hook on end request to emit the log entry of the HTTP request. let finished = false; const handler = () => { if (finished) { return; } finished = true; // nologs if (typeof options.nolog === 'function') { if (options.nolog(req, res) === true) { req._logging = false; return; } } res.responseTime = new Date() - start; // status code response level handling if (res.statusCode && options.level === 'auto') { level = levels.INFO; if (res.statusCode >= 300) level = levels.WARN; if (res.statusCode >= 400) level = levels.ERROR; } level = matchRules(res.statusCode, level, options.statusRules); const combinedTokens = assembleTokens(req, res, options.tokens || []); if (options.context) thisLogger.addContext('res', res); if (typeof fmt === 'function') { const line = fmt(req, res, (str) => format(str, combinedTokens)); if (line) thisLogger.log(level, line); } else { thisLogger.log(level, format(fmt, combinedTokens)); } if (options.context) thisLogger.removeContext('res'); }; res.on('end', handler); res.on('finish', handler); res.on('error', handler); res.on('close', handler); } // ensure next gets always called return next(); }; };
/* eslint no-underscore-dangle: ["error", { "allow": ["__statusCode", "_remoteAddress", "__headers", "_logging"] }] */ const levels = require('./levels'); const DEFAULT_FORMAT = ':remote-addr - -' + ' ":method :url HTTP/:http-version"' + ' :status :content-length ":referrer"' + ' ":user-agent"'; /** * Return request url path, * adding this function prevents the Cyclomatic Complexity, * for the assemble_tokens function at low, to pass the tests. * * @param {IncomingMessage} req * @return {string} * @api private */ function getUrl(req) { return req.originalUrl || req.url; } /** * Adds custom {token, replacement} objects to defaults, * overwriting the defaults if any tokens clash * * @param {IncomingMessage} req * @param {ServerResponse} res * @param {Array} customTokens * [{ token: string-or-regexp, replacement: string-or-replace-function }] * @return {Array} */ function assembleTokens(req, res, customTokens) { const arrayUniqueTokens = (array) => { const a = array.concat(); for (let i = 0; i < a.length; ++i) { for (let j = i + 1; j < a.length; ++j) { // not === because token can be regexp object // eslint-disable-next-line eqeqeq if (a[i].token == a[j].token) { a.splice(j--, 1); // eslint-disable-line no-plusplus } } } return a; }; const defaultTokens = []; defaultTokens.push({ token: ':url', replacement: getUrl(req) }); defaultTokens.push({ token: ':protocol', replacement: req.protocol }); defaultTokens.push({ token: ':hostname', replacement: req.hostname }); defaultTokens.push({ token: ':method', replacement: req.method }); defaultTokens.push({ token: ':status', replacement: res.__statusCode || res.statusCode, }); defaultTokens.push({ token: ':response-time', replacement: res.responseTime, }); defaultTokens.push({ token: ':date', replacement: new Date().toUTCString() }); defaultTokens.push({ token: ':referrer', replacement: req.headers.referer || req.headers.referrer || '', }); defaultTokens.push({ token: ':http-version', replacement: `${req.httpVersionMajor}.${req.httpVersionMinor}`, }); defaultTokens.push({ token: ':remote-addr', replacement: req.headers['x-forwarded-for'] || req.ip || req._remoteAddress || (req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress))), }); defaultTokens.push({ token: ':user-agent', replacement: req.headers['user-agent'], }); defaultTokens.push({ token: ':content-length', replacement: res.getHeader('content-length') || (res.__headers && res.__headers['Content-Length']) || '-', }); defaultTokens.push({ token: /:req\[([^\]]+)]/g, replacement(_, field) { return req.headers[field.toLowerCase()]; }, }); defaultTokens.push({ token: /:res\[([^\]]+)]/g, replacement(_, field) { return ( res.getHeader(field.toLowerCase()) || (res.__headers && res.__headers[field]) ); }, }); return arrayUniqueTokens(customTokens.concat(defaultTokens)); } /** * Return formatted log line. * * @param {string} str * @param {Array} tokens * @return {string} * @api private */ function format(str, tokens) { for (let i = 0; i < tokens.length; i++) { str = str.replace(tokens[i].token, tokens[i].replacement); } return str; } /** * Return RegExp Object about nolog * * @param {(string|Array)} nolog * @return {RegExp} * @api private * * syntax * 1. 
String * 1.1 "\\.gif" * NOT LOGGING http://example.com/hoge.gif and http://example.com/hoge.gif?fuga * LOGGING http://example.com/hoge.agif * 1.2 in "\\.gif|\\.jpg$" * NOT LOGGING http://example.com/hoge.gif and * http://example.com/hoge.gif?fuga and http://example.com/hoge.jpg?fuga * LOGGING http://example.com/hoge.agif, * http://example.com/hoge.ajpg and http://example.com/hoge.jpg?hoge * 1.3 in "\\.(gif|jpe?g|png)$" * NOT LOGGING http://example.com/hoge.gif and http://example.com/hoge.jpeg * LOGGING http://example.com/hoge.gif?uid=2 and http://example.com/hoge.jpg?pid=3 * 2. RegExp * 2.1 in /\.(gif|jpe?g|png)$/ * SAME AS 1.3 * 3. Array * 3.1 ["\\.jpg$", "\\.png", "\\.gif"] * SAME AS "\\.jpg|\\.png|\\.gif" */ function createNoLogCondition(nolog) { let regexp = null; if (nolog instanceof RegExp) { regexp = nolog; } if (typeof nolog === 'string') { regexp = new RegExp(nolog); } if (Array.isArray(nolog)) { // convert to strings const regexpsAsStrings = nolog.map((reg) => reg.source ? reg.source : reg ); regexp = new RegExp(regexpsAsStrings.join('|')); } return regexp; } /** * Allows users to define rules around status codes to assign them to a specific * logging level. * There are two types of rules: * - RANGE: matches a code within a certain range * E.g. { 'from': 200, 'to': 299, 'level': 'info' } * - CONTAINS: matches a code to a set of expected codes * E.g. { 'codes': [200, 203], 'level': 'debug' } * Note*: Rules are respected only in order of prescendence. * * @param {Number} statusCode * @param {Level} currentLevel * @param {Object} ruleSet * @return {Level} * @api private */ function matchRules(statusCode, currentLevel, ruleSet) { let level = currentLevel; if (ruleSet) { const matchedRule = ruleSet.find((rule) => { let ruleMatched = false; if (rule.from && rule.to) { ruleMatched = statusCode >= rule.from && statusCode <= rule.to; } else { ruleMatched = rule.codes.indexOf(statusCode) !== -1; } return ruleMatched; }); if (matchedRule) { level = levels.getLevel(matchedRule.level, level); } } return level; } /** * Log requests with the given `options` or a `format` string. * * Options: * * - `format` Format string, see below for tokens * - `level` A log4js levels instance. 
Supports also 'auto' * - `nolog` A string or RegExp to exclude target logs or function(req, res): boolean * - `statusRules` A array of rules for setting specific logging levels base on status codes * - `context` Whether to add a response of express to the context * * Tokens: * * - `:req[header]` ex: `:req[Accept]` * - `:res[header]` ex: `:res[Content-Length]` * - `:http-version` * - `:response-time` * - `:remote-addr` * - `:date` * - `:method` * - `:url` * - `:referrer` * - `:user-agent` * - `:status` * * @return {Function} * @param logger4js * @param options * @api public */ module.exports = function getLogger(logger4js, options) { if (typeof options === 'string' || typeof options === 'function') { options = { format: options }; } else { options = options || {}; } const thisLogger = logger4js; let level = levels.getLevel(options.level, levels.INFO); const fmt = options.format || DEFAULT_FORMAT; return (req, res, next) => { // mount safety if (typeof req._logging !== 'undefined') return next(); // nologs if (typeof options.nolog !== 'function') { const nolog = createNoLogCondition(options.nolog); if (nolog && nolog.test(req.originalUrl)) return next(); } if (thisLogger.isLevelEnabled(level) || options.level === 'auto') { const start = new Date(); const { writeHead } = res; // flag as logging req._logging = true; // proxy for statusCode. res.writeHead = (code, headers) => { res.writeHead = writeHead; res.writeHead(code, headers); res.__statusCode = code; res.__headers = headers || {}; }; // hook on end request to emit the log entry of the HTTP request. let finished = false; const handler = () => { if (finished) { return; } finished = true; // nologs if (typeof options.nolog === 'function') { if (options.nolog(req, res) === true) { req._logging = false; return; } } res.responseTime = new Date() - start; // status code response level handling if (res.statusCode && options.level === 'auto') { level = levels.INFO; if (res.statusCode >= 300) level = levels.WARN; if (res.statusCode >= 400) level = levels.ERROR; } level = matchRules(res.statusCode, level, options.statusRules); const combinedTokens = assembleTokens(req, res, options.tokens || []); if (options.context) thisLogger.addContext('res', res); if (typeof fmt === 'function') { const line = fmt(req, res, (str) => format(str, combinedTokens)); if (line) thisLogger.log(level, line); } else { thisLogger.log(level, format(fmt, combinedTokens)); } if (options.context) thisLogger.removeContext('res'); }; res.on('end', handler); res.on('finish', handler); res.on('error', handler); res.on('close', handler); } // ensure next gets always called return next(); }; };
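The JSDoc block above lists the supported options and tokens for the middleware. A rough usage sketch follows, assuming an Express (or Connect) style app and the published `log4js` package; the option values are examples, not defaults:

```javascript
const log4js = require('log4js');
const express = require('express');

log4js.configure({
  appenders: { http: { type: 'stdout' } },
  categories: { default: { appenders: ['http'], level: 'info' } },
});

const app = express();
app.use(
  log4js.connectLogger(log4js.getLogger('http'), {
    level: 'auto', // INFO/WARN/ERROR chosen from the response status code
    nolog: '\\.(gif|jpe?g|png)$', // skip image requests, per the syntax notes above
    statusRules: [{ from: 200, to: 299, level: 'debug' }], // RANGE rule
    format: ':method :url :status :content-length ":user-agent"',
  })
);

app.listen(3000);
```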
1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`.

Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
./lib/logger.js
/* eslint no-underscore-dangle: ["error", { "allow": ["_log"] }] */ const debug = require('debug')('log4js:logger'); const LoggingEvent = require('./LoggingEvent'); const levels = require('./levels'); const clustering = require('./clustering'); const categories = require('./categories'); const configuration = require('./configuration'); const stackReg = /at (?:(.+)\s+\()?(?:(.+?):(\d+)(?::(\d+))?|([^)]+))\)?/; /** * The top entry is the Error */ const baseCallStackSkip = 1; /** * The _log function is 3 levels deep, we need to skip those to make it to the callSite */ const defaultErrorCallStackSkip = 3; /** * * @param {Error} data * @param {number} skipIdx * @returns {import('../types/log4js').CallStack | null} */ function defaultParseCallStack( data, skipIdx = defaultErrorCallStackSkip + baseCallStackSkip ) { try { const stacklines = data.stack.split('\n').slice(skipIdx); if (!stacklines.length) { // There's no stack in this stack // Should we try a previous index if skipIdx was set? return null; } const lineMatch = stackReg.exec(stacklines[0]); /* istanbul ignore else: failsafe */ if (lineMatch && lineMatch.length === 6) { // extract class, function and alias names let className = ''; let functionName = ''; let functionAlias = ''; if (lineMatch[1] && lineMatch[1] !== '') { // WARN: this will unset alias if alias is not present. [functionName, functionAlias] = lineMatch[1] .replace(/[[\]]/g, '') .split(' as '); functionAlias = functionAlias || ''; if (functionName.includes('.')) [className, functionName] = functionName.split('.'); } return { fileName: lineMatch[2], lineNumber: parseInt(lineMatch[3], 10), columnNumber: parseInt(lineMatch[4], 10), callStack: stacklines.join('\n'), className, functionName, functionAlias, callerName: lineMatch[1] || '', }; // eslint-disable-next-line no-else-return } else { // will never get here unless nodejs has changes to Error console.error('log4js.logger - defaultParseCallStack error'); // eslint-disable-line no-console } } catch (err) { // will never get error unless nodejs has breaking changes to Error console.error('log4js.logger - defaultParseCallStack error', err); // eslint-disable-line no-console } return null; } /** * Logger to log messages. * use {@see log4js#getLogger(String)} to get an instance. 
* * @name Logger * @namespace Log4js * @param name name of category to log to * @param level - the loglevel for the category * @param dispatch - the function which will receive the logevents * * @author Stephan Strittmatter */ class Logger { constructor(name) { if (!name) { throw new Error('No category provided.'); } this.category = name; this.context = {}; /** @private */ this.callStackSkipIndex = 0; /** @private */ this.parseCallStack = defaultParseCallStack; debug(`Logger created (${this.category}, ${this.level})`); } get level() { return levels.getLevel( categories.getLevelForCategory(this.category), levels.OFF ); } set level(level) { categories.setLevelForCategory( this.category, levels.getLevel(level, this.level) ); } get useCallStack() { return categories.getEnableCallStackForCategory(this.category); } set useCallStack(bool) { categories.setEnableCallStackForCategory(this.category, bool === true); } get callStackLinesToSkip() { return this.callStackSkipIndex; } set callStackLinesToSkip(number) { if (typeof number !== 'number') { throw new TypeError('Must be a number'); } if (number < 0) { throw new RangeError('Must be >= 0'); } this.callStackSkipIndex = number; } log(level, ...args) { const logLevel = levels.getLevel(level); if (!logLevel) { if (configuration.validIdentifier(level) && args.length > 0) { // logLevel not found but of valid signature, WARN before fallback to INFO this.log( levels.WARN, 'log4js:logger.log: valid log-level not found as first parameter given:', level ); this.log(levels.INFO, `[${level}]`, ...args); } else { // apart from fallback, allow .log(...args) to be synonym with .log("INFO", ...args) this.log(levels.INFO, level, ...args); } } else if (this.isLevelEnabled(logLevel)) { this._log(logLevel, args); } } isLevelEnabled(otherLevel) { return this.level.isLessThanOrEqualTo(otherLevel); } _log(level, data) { debug(`sending log data (${level}) to appenders`); const error = data.find((item) => item instanceof Error); let callStack; if (this.useCallStack) { try { if (error) { callStack = this.parseCallStack( error, this.callStackSkipIndex + baseCallStackSkip ); } } catch (_err) { // Ignore Error and use the original method of creating a new Error. } callStack = callStack || this.parseCallStack( new Error(), this.callStackSkipIndex + defaultErrorCallStackSkip + baseCallStackSkip ); } const loggingEvent = new LoggingEvent( this.category, level, data, this.context, callStack, error ); clustering.send(loggingEvent); } addContext(key, value) { this.context[key] = value; } removeContext(key) { delete this.context[key]; } clearContext() { this.context = {}; } setParseCallStackFunction(parseFunction) { if (typeof parseFunction === 'function') { this.parseCallStack = parseFunction; } else if (parseFunction === undefined) { this.parseCallStack = defaultParseCallStack; } else { throw new TypeError('Invalid type passed to setParseCallStackFunction'); } } } function addLevelMethods(target) { const level = levels.getLevel(target); const levelStrLower = level.toString().toLowerCase(); const levelMethod = levelStrLower.replace(/_([a-z])/g, (g) => g[1].toUpperCase() ); const isLevelMethod = levelMethod[0].toUpperCase() + levelMethod.slice(1); Logger.prototype[`is${isLevelMethod}Enabled`] = function () { return this.isLevelEnabled(level); }; Logger.prototype[levelMethod] = function (...args) { this.log(level, ...args); }; } levels.levels.forEach(addLevelMethods); configuration.addListener(() => { levels.levels.forEach(addLevelMethods); }); module.exports = Logger;
/* eslint no-underscore-dangle: ["error", { "allow": ["_log"] }] */ const debug = require('debug')('log4js:logger'); const LoggingEvent = require('./LoggingEvent'); const levels = require('./levels'); const clustering = require('./clustering'); const categories = require('./categories'); const configuration = require('./configuration'); const stackReg = /at (?:(.+)\s+\()?(?:(.+?):(\d+)(?::(\d+))?|([^)]+))\)?/; /** * The top entry is the Error */ const baseCallStackSkip = 1; /** * The _log function is 3 levels deep, we need to skip those to make it to the callSite */ const defaultErrorCallStackSkip = 3; /** * * @param {Error} data * @param {number} skipIdx * @returns {import('../types/log4js').CallStack | null} */ function defaultParseCallStack( data, skipIdx = defaultErrorCallStackSkip + baseCallStackSkip ) { try { const stacklines = data.stack.split('\n').slice(skipIdx); if (!stacklines.length) { // There's no stack in this stack // Should we try a previous index if skipIdx was set? return null; } const lineMatch = stackReg.exec(stacklines[0]); /* istanbul ignore else: failsafe */ if (lineMatch && lineMatch.length === 6) { // extract class, function and alias names let className = ''; let functionName = ''; let functionAlias = ''; if (lineMatch[1] && lineMatch[1] !== '') { // WARN: this will unset alias if alias is not present. [functionName, functionAlias] = lineMatch[1] .replace(/[[\]]/g, '') .split(' as '); functionAlias = functionAlias || ''; if (functionName.includes('.')) [className, functionName] = functionName.split('.'); } return { fileName: lineMatch[2], lineNumber: parseInt(lineMatch[3], 10), columnNumber: parseInt(lineMatch[4], 10), callStack: stacklines.join('\n'), className, functionName, functionAlias, callerName: lineMatch[1] || '', }; // eslint-disable-next-line no-else-return } else { // will never get here unless nodejs has changes to Error console.error('log4js.logger - defaultParseCallStack error'); // eslint-disable-line no-console } } catch (err) { // will never get error unless nodejs has breaking changes to Error console.error('log4js.logger - defaultParseCallStack error', err); // eslint-disable-line no-console } return null; } /** * Logger to log messages. * use {@see log4js#getLogger(String)} to get an instance. 
* * @name Logger * @namespace Log4js * @param name name of category to log to * @param level - the loglevel for the category * @param dispatch - the function which will receive the logevents * * @author Stephan Strittmatter */ class Logger { constructor(name) { if (!name) { throw new Error('No category provided.'); } this.category = name; this.context = {}; /** @private */ this.callStackSkipIndex = 0; /** @private */ this.parseCallStack = defaultParseCallStack; debug(`Logger created (${this.category}, ${this.level})`); } get level() { return levels.getLevel( categories.getLevelForCategory(this.category), levels.OFF ); } set level(level) { categories.setLevelForCategory( this.category, levels.getLevel(level, this.level) ); } get useCallStack() { return categories.getEnableCallStackForCategory(this.category); } set useCallStack(bool) { categories.setEnableCallStackForCategory(this.category, bool === true); } get callStackLinesToSkip() { return this.callStackSkipIndex; } set callStackLinesToSkip(number) { if (typeof number !== 'number') { throw new TypeError('Must be a number'); } if (number < 0) { throw new RangeError('Must be >= 0'); } this.callStackSkipIndex = number; } log(level, ...args) { const logLevel = levels.getLevel(level); if (!logLevel) { if (configuration.validIdentifier(level) && args.length > 0) { // logLevel not found but of valid signature, WARN before fallback to INFO this.log( levels.WARN, 'log4js:logger.log: valid log-level not found as first parameter given:', level ); this.log(levels.INFO, `[${level}]`, ...args); } else { // apart from fallback, allow .log(...args) to be synonym with .log("INFO", ...args) this.log(levels.INFO, level, ...args); } } else if (this.isLevelEnabled(logLevel)) { this._log(logLevel, args); } } isLevelEnabled(otherLevel) { return this.level.isLessThanOrEqualTo(otherLevel); } _log(level, data) { debug(`sending log data (${level}) to appenders`); const error = data.find((item) => item instanceof Error); let callStack; if (this.useCallStack) { try { if (error) { callStack = this.parseCallStack( error, this.callStackSkipIndex + baseCallStackSkip ); } } catch (_err) { // Ignore Error and use the original method of creating a new Error. } callStack = callStack || this.parseCallStack( new Error(), this.callStackSkipIndex + defaultErrorCallStackSkip + baseCallStackSkip ); } const loggingEvent = new LoggingEvent( this.category, level, data, this.context, callStack, error ); clustering.send(loggingEvent); } addContext(key, value) { this.context[key] = value; } removeContext(key) { delete this.context[key]; } clearContext() { this.context = {}; } setParseCallStackFunction(parseFunction) { if (typeof parseFunction === 'function') { this.parseCallStack = parseFunction; } else if (typeof parseFunction === 'undefined') { this.parseCallStack = defaultParseCallStack; } else { throw new TypeError('Invalid type passed to setParseCallStackFunction'); } } } function addLevelMethods(target) { const level = levels.getLevel(target); const levelStrLower = level.toString().toLowerCase(); const levelMethod = levelStrLower.replace(/_([a-z])/g, (g) => g[1].toUpperCase() ); const isLevelMethod = levelMethod[0].toUpperCase() + levelMethod.slice(1); Logger.prototype[`is${isLevelMethod}Enabled`] = function () { return this.isLevelEnabled(level); }; Logger.prototype[levelMethod] = function (...args) { this.log(level, ...args); }; } levels.levels.forEach(addLevelMethods); configuration.addListener(() => { levels.levels.forEach(addLevelMethods); }); module.exports = Logger;
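The accessors defined above (`useCallStack`, `callStackLinesToSkip`, `setParseCallStackFunction`) control whether and how location information is parsed out of an `Error` stack and attached to each event. A minimal usage sketch, assuming the published `log4js` package; the `configure()` call is just enough to obtain a logger:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: { out: { type: 'stdout' } },
  categories: { default: { appenders: ['out'], level: 'debug' } },
});

const logger = log4js.getLogger();

// attach file/line/call-stack info to every logging event for this category
logger.useCallStack = true;

// skip one extra stack frame, e.g. when all logging goes through a wrapper
logger.callStackLinesToSkip = 1;

// passing undefined restores the default parser; a function replaces it,
// anything else throws a TypeError (see setParseCallStackFunction above)
logger.setParseCallStackFunction(undefined);

logger.debug('a message that now carries location information');
```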
1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`.

Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
./test/tap/LoggingEvent-test.js
const flatted = require('flatted'); const { test } = require('tap'); const LoggingEvent = require('../../lib/LoggingEvent'); const levels = require('../../lib/levels'); test('LoggingEvent', (batch) => { batch.test('should serialise to flatted', (t) => { const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message', parseInt('abc', 10), 1 / 0, -1 / 0, undefined], { user: 'bob', } ); // set the event date to a known value event.startTime = new Date(Date.UTC(2018, 1, 4, 18, 30, 23, 10)); const rehydratedEvent = flatted.parse(event.serialise()); t.equal(rehydratedEvent.startTime, '2018-02-04T18:30:23.010Z'); t.equal(rehydratedEvent.categoryName, 'cheese'); t.equal(rehydratedEvent.level.levelStr, 'DEBUG'); t.equal(rehydratedEvent.data.length, 5); t.equal(rehydratedEvent.data[0], 'log message'); t.equal(rehydratedEvent.data[1], 'NaN'); t.equal(rehydratedEvent.data[2], 'Infinity'); t.equal(rehydratedEvent.data[3], '-Infinity'); t.equal(rehydratedEvent.data[4], 'undefined'); t.equal(rehydratedEvent.context.user, 'bob'); t.end(); }); batch.test('should deserialise from flatted', (t) => { const dehydratedEvent = flatted.stringify({ startTime: '2018-02-04T10:25:23.010Z', categoryName: 'biscuits', level: { levelStr: 'INFO', }, data: ['some log message', { x: 1 }], context: { thing: 'otherThing' }, pid: '1234', functionName: 'bound', fileName: 'domain.js', lineNumber: 421, columnNumber: 15, callStack: 'at bound (domain.js:421:15)\n', }); const event = LoggingEvent.deserialise(dehydratedEvent); t.type(event, LoggingEvent); t.same(event.startTime, new Date(Date.UTC(2018, 1, 4, 10, 25, 23, 10))); t.equal(event.categoryName, 'biscuits'); t.same(event.level, levels.INFO); t.equal(event.data[0], 'some log message'); t.equal(event.data[1].x, 1); t.equal(event.context.thing, 'otherThing'); t.equal(event.pid, '1234'); t.equal(event.functionName, 'bound'); t.equal(event.fileName, 'domain.js'); t.equal(event.lineNumber, 421); t.equal(event.columnNumber, 15); t.equal(event.callStack, 'at bound (domain.js:421:15)\n'); t.end(); }); batch.test('Should correct construct with/without location info', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = ''; const functionName = ''; const functionAlias = ''; const callerName = ''; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); const event2 = new LoggingEvent('cheese', levels.DEBUG, ['log message'], { user: 'bob', }); t.equal(event2.fileName, undefined); 
t.equal(event2.lineNumber, undefined); t.equal(event2.columnNumber, undefined); t.equal(event2.callStack, undefined); t.equal(event2.className, undefined); t.equal(event2.functionName, undefined); t.equal(event2.functionAlias, undefined); t.equal(event2.callerName, undefined); t.end(); }); batch.test('Should contain class, method and alias names', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); t.end(); }); batch.test('Should correctly serialize and deserialize', (t) => { const error = new Error('test'); const location = { fileName: __filename, lineNumber: 123, columnNumber: 52, callStack: error.stack, className: 'Foo', functionName: 'test', functionAlias: 'baz', callerName: 'Foo.test [as baz]', }; const event = new LoggingEvent( 'cheese', levels.DEBUG, [error, 'log message'], { user: 'bob', }, location, error ); const event2 = LoggingEvent.deserialise(event.serialise()); t.match(event2, event); t.end(); }); batch.end(); });
const flatted = require('flatted'); const { test } = require('tap'); const LoggingEvent = require('../../lib/LoggingEvent'); const levels = require('../../lib/levels'); test('LoggingEvent', (batch) => { batch.test('should serialise to flatted', (t) => { const event = new LoggingEvent( 'cheese', levels.DEBUG, [ 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', } ); // set the event date to a known value event.startTime = new Date(Date.UTC(2018, 1, 4, 18, 30, 23, 10)); const rehydratedEvent = flatted.parse(event.serialise()); t.equal(rehydratedEvent.startTime, '2018-02-04T18:30:23.010Z'); t.equal(rehydratedEvent.categoryName, 'cheese'); t.equal(rehydratedEvent.level.levelStr, 'DEBUG'); t.equal(rehydratedEvent.data.length, 9); t.equal(rehydratedEvent.data[0], 'log message'); t.equal(rehydratedEvent.data[1], '__LOG4JS_NaN__'); t.equal(rehydratedEvent.data[2], 'NaN'); t.equal(rehydratedEvent.data[3], '__LOG4JS_Infinity__'); t.equal(rehydratedEvent.data[4], 'Infinity'); t.equal(rehydratedEvent.data[5], '__LOG4JS_-Infinity__'); t.equal(rehydratedEvent.data[6], '-Infinity'); t.equal(rehydratedEvent.data[7], '__LOG4JS_undefined__'); t.equal(rehydratedEvent.data[8], 'undefined'); t.equal(rehydratedEvent.context.user, 'bob'); t.end(); }); batch.test('should deserialise from flatted', (t) => { const dehydratedEvent = flatted.stringify({ startTime: '2018-02-04T10:25:23.010Z', categoryName: 'biscuits', level: { levelStr: 'INFO', }, data: [ 'some log message', { x: 1 }, '__LOG4JS_NaN__', 'NaN', '__LOG4JS_Infinity__', 'Infinity', '__LOG4JS_-Infinity__', '-Infinity', '__LOG4JS_undefined__', 'undefined', ], context: { thing: 'otherThing' }, pid: '1234', functionName: 'bound', fileName: 'domain.js', lineNumber: 421, columnNumber: 15, callStack: 'at bound (domain.js:421:15)\n', }); const event = LoggingEvent.deserialise(dehydratedEvent); t.type(event, LoggingEvent); t.same(event.startTime, new Date(Date.UTC(2018, 1, 4, 10, 25, 23, 10))); t.equal(event.categoryName, 'biscuits'); t.same(event.level, levels.INFO); t.equal(event.data.length, 10); t.equal(event.data[0], 'some log message'); t.equal(event.data[1].x, 1); t.ok(Number.isNaN(event.data[2])); t.equal(event.data[3], 'NaN'); t.equal(event.data[4], 1 / 0); t.equal(event.data[5], 'Infinity'); t.equal(event.data[6], -1 / 0); t.equal(event.data[7], '-Infinity'); t.equal(event.data[8], undefined); t.equal(event.data[9], 'undefined'); t.equal(event.context.thing, 'otherThing'); t.equal(event.pid, '1234'); t.equal(event.functionName, 'bound'); t.equal(event.fileName, 'domain.js'); t.equal(event.lineNumber, 421); t.equal(event.columnNumber, 15); t.equal(event.callStack, 'at bound (domain.js:421:15)\n'); t.end(); }); batch.test('Should correct construct with/without location info', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = ''; const functionName = ''; const functionAlias = ''; 
const callerName = ''; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); const event2 = new LoggingEvent('cheese', levels.DEBUG, ['log message'], { user: 'bob', }); t.equal(event2.fileName, undefined); t.equal(event2.lineNumber, undefined); t.equal(event2.columnNumber, undefined); t.equal(event2.callStack, undefined); t.equal(event2.className, undefined); t.equal(event2.functionName, undefined); t.equal(event2.functionAlias, undefined); t.equal(event2.callerName, undefined); t.end(); }); batch.test('Should contain class, method and alias names', (t) => { // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = '/log4js-node/test/tap/layouts-test.js'; const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const location = { fileName, lineNumber, columnNumber, callStack, className, functionName, functionAlias, callerName, }; const event = new LoggingEvent( 'cheese', levels.DEBUG, ['log message'], { user: 'bob' }, location ); t.equal(event.fileName, fileName); t.equal(event.lineNumber, lineNumber); t.equal(event.columnNumber, columnNumber); t.equal(event.callStack, callStack); t.equal(event.className, className); t.equal(event.functionName, functionName); t.equal(event.functionAlias, functionAlias); t.equal(event.callerName, callerName); t.end(); }); batch.test('Should correctly serialize and deserialize', (t) => { const error = new Error('test'); const location = { fileName: __filename, lineNumber: 123, columnNumber: 52, callStack: error.stack, className: 'Foo', functionName: 'test', functionAlias: 'baz', callerName: 'Foo.test [as baz]', }; const event = new LoggingEvent( 'cheese', levels.DEBUG, [ error, 'log message', Number('abc'), 'NaN', 1 / 0, 'Infinity', -1 / 0, '-Infinity', undefined, 'undefined', ], { user: 'bob', }, location, error ); const event2 = LoggingEvent.deserialise(event.serialise()); t.match(event2, event); t.end(); }); batch.end(); });
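The tests above check that `LoggingEvent.serialise()` produces output that `flatted.parse()` can read back. As background, here is a minimal, standalone illustration of the `flatted` module's round-trip behaviour with circular references; it is independent of log4js and uses only flatted's documented `stringify`/`parse` API.

```javascript
const { stringify, parse } = require('flatted');

const a = { name: 'cheese' };
a.self = a; // circular reference that plain JSON.stringify would reject

const wire = stringify(a); // safe to transmit as a string
const b = parse(wire);

console.log(b.name); // 'cheese'
console.log(b.self === b); // true - the cycle is restored
```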
1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
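The mapping in the table can be illustrated with a small, self-contained sketch. This is not the actual log4js implementation; `serialiseValue` and `deserialiseValue` are hypothetical helper names used only to show the sentinel-string idea for values JSON cannot represent.

```javascript
// Hypothetical sketch: replace non-JSON values with sentinel strings on the
// way out, and restore them on the way back in.
const serialiseValue = (value) => {
  if (value === undefined) return '__LOG4JS_undefined__';
  if (Number.isNaN(value)) return '__LOG4JS_NaN__';
  if (value === Infinity) return '__LOG4JS_Infinity__';
  if (value === -Infinity) return '__LOG4JS_-Infinity__';
  return value;
};

const deserialiseValue = (value) => {
  switch (value) {
    case '__LOG4JS_undefined__': return undefined;
    case '__LOG4JS_NaN__': return NaN;
    case '__LOG4JS_Infinity__': return Infinity;
    case '__LOG4JS_-Infinity__': return -Infinity;
    default: return value;
  }
};

// Round trip: the output matches the input, and the literal string 'NaN'
// is left untouched because it never matches a sentinel.
const input = [Number('abc'), 1 / 0, -1 / 0, undefined, 'NaN'];
const wire = JSON.stringify(input.map(serialiseValue));
const output = JSON.parse(wire).map(deserialiseValue);
console.log(wire); // ["__LOG4JS_NaN__","__LOG4JS_Infinity__","__LOG4JS_-Infinity__","__LOG4JS_undefined__","NaN"]
console.log(output); // [ NaN, Infinity, -Infinity, undefined, 'NaN' ]
```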
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/logLevelFilter-test.js
const { test } = require('tap'); const fs = require('fs'); const os = require('os'); const EOL = os.EOL || '\n'; const osDelay = process.platform === 'win32' ? 400 : 200; function remove(filename) { try { fs.unlinkSync(filename); } catch (e) { // doesn't really matter if it failed } } test('log4js logLevelFilter', (batch) => { batch.test('appender', (t) => { const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, }, categories: { default: { appenders: ['filtered'], level: 'debug' }, }, }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.warn('neither should this'); logger.error('this should, though'); logger.fatal('so should this'); const logEvents = recording.replay(); t.test( 'should only pass log events greater than or equal to its own level', (assert) => { assert.equal(logEvents.length, 2); assert.equal(logEvents[0].data[0], 'this should, though'); assert.equal(logEvents[1].data[0], 'so should this'); assert.end(); } ); t.end(); }); batch.test('configure', (t) => { const log4js = require('../../lib/log4js'); remove(`${__dirname}/logLevelFilter.log`); remove(`${__dirname}/logLevelFilter-warnings.log`); remove(`${__dirname}/logLevelFilter-debugs.log`); t.teardown(() => { remove(`${__dirname}/logLevelFilter.log`); remove(`${__dirname}/logLevelFilter-warnings.log`); remove(`${__dirname}/logLevelFilter-debugs.log`); }); log4js.configure({ appenders: { 'warning-file': { type: 'file', filename: 'test/tap/logLevelFilter-warnings.log', layout: { type: 'messagePassThrough' }, }, warnings: { type: 'logLevelFilter', level: 'WARN', appender: 'warning-file', }, 'debug-file': { type: 'file', filename: 'test/tap/logLevelFilter-debugs.log', layout: { type: 'messagePassThrough' }, }, debugs: { type: 'logLevelFilter', level: 'TRACE', maxLevel: 'DEBUG', appender: 'debug-file', }, tests: { type: 'file', filename: 'test/tap/logLevelFilter.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['tests', 'warnings', 'debugs'], level: 'trace' }, }, }); const logger = log4js.getLogger('tests'); logger.debug('debug'); logger.info('info'); logger.error('error'); logger.warn('warn'); logger.debug('debug'); logger.trace('trace'); // wait for the file system to catch up setTimeout(() => { t.test('tmp-tests.log should contain all log messages', (assert) => { fs.readFile( `${__dirname}/logLevelFilter.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, [ 'debug', 'info', 'error', 'warn', 'debug', 'trace', ]); assert.end(); } ); }); t.test( 'tmp-tests-warnings.log should contain only error and warning logs', (assert) => { fs.readFile( `${__dirname}/logLevelFilter-warnings.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, ['error', 'warn']); assert.end(); } ); } ); t.test( 'tmp-tests-debugs.log should contain only trace and debug logs', (assert) => { fs.readFile( `${__dirname}/logLevelFilter-debugs.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, ['debug', 'debug', 'trace']); assert.end(); } ); } ); t.end(); }, osDelay); }); batch.end(); });
const { test } = require('tap'); const fs = require('fs'); const os = require('os'); const EOL = os.EOL || '\n'; const osDelay = process.platform === 'win32' ? 400 : 200; function remove(filename) { try { fs.unlinkSync(filename); } catch (e) { // doesn't really matter if it failed } } test('log4js logLevelFilter', (batch) => { batch.test('appender', (t) => { const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'logLevelFilter', appender: 'recorder', level: 'ERROR', }, }, categories: { default: { appenders: ['filtered'], level: 'debug' }, }, }); const logger = log4js.getLogger('logLevelTest'); logger.debug('this should not trigger an event'); logger.warn('neither should this'); logger.error('this should, though'); logger.fatal('so should this'); const logEvents = recording.replay(); t.test( 'should only pass log events greater than or equal to its own level', (assert) => { assert.equal(logEvents.length, 2); assert.equal(logEvents[0].data[0], 'this should, though'); assert.equal(logEvents[1].data[0], 'so should this'); assert.end(); } ); t.end(); }); batch.test('configure', (t) => { const log4js = require('../../lib/log4js'); remove(`${__dirname}/logLevelFilter.log`); remove(`${__dirname}/logLevelFilter-warnings.log`); remove(`${__dirname}/logLevelFilter-debugs.log`); t.teardown(() => { remove(`${__dirname}/logLevelFilter.log`); remove(`${__dirname}/logLevelFilter-warnings.log`); remove(`${__dirname}/logLevelFilter-debugs.log`); }); log4js.configure({ appenders: { 'warning-file': { type: 'file', filename: 'test/tap/logLevelFilter-warnings.log', layout: { type: 'messagePassThrough' }, }, warnings: { type: 'logLevelFilter', level: 'WARN', appender: 'warning-file', }, 'debug-file': { type: 'file', filename: 'test/tap/logLevelFilter-debugs.log', layout: { type: 'messagePassThrough' }, }, debugs: { type: 'logLevelFilter', level: 'TRACE', maxLevel: 'DEBUG', appender: 'debug-file', }, tests: { type: 'file', filename: 'test/tap/logLevelFilter.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['tests', 'warnings', 'debugs'], level: 'trace' }, }, }); const logger = log4js.getLogger('tests'); logger.debug('debug'); logger.info('info'); logger.error('error'); logger.warn('warn'); logger.debug('debug'); logger.trace('trace'); // wait for the file system to catch up setTimeout(() => { t.test('logLevelFilter.log should contain all log messages', (assert) => { fs.readFile( `${__dirname}/logLevelFilter.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, [ 'debug', 'info', 'error', 'warn', 'debug', 'trace', ]); assert.end(); } ); }); t.test( 'logLevelFilter-warnings.log should contain only error and warning logs', (assert) => { fs.readFile( `${__dirname}/logLevelFilter-warnings.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, ['error', 'warn']); assert.end(); } ); } ); t.test( 'logLevelFilter-debugs.log should contain only trace and debug logs', (assert) => { fs.readFile( `${__dirname}/logLevelFilter-debugs.log`, 'utf8', (err, contents) => { const messages = contents.trim().split(EOL); assert.same(messages, ['debug', 'debug', 'trace']); assert.end(); } ); } ); t.end(); }, osDelay); }); batch.end(); });
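The configuration exercised in this test routes a band of levels to one appender: `logLevelFilter` forwards only events between `level` and `maxLevel` (inclusive) to the wrapped appender. A condensed configuration sketch, using the same options as the test but with illustrative file names:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    everything: { type: 'file', filename: 'all.log' },
    'debug-file': { type: 'file', filename: 'debugs.log' },
    debugs: {
      // only TRACE and DEBUG events reach debug-file
      type: 'logLevelFilter',
      level: 'TRACE',
      maxLevel: 'DEBUG',
      appender: 'debug-file',
    },
  },
  categories: {
    default: { appenders: ['everything', 'debugs'], level: 'trace' },
  },
});

log4js.getLogger().debug('goes to both files');
log4js.getLogger().error('goes only to all.log');
```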
1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
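For context on why sentinels are needed at all: plain JSON serialisation silently loses these values, so without a substitution step the receiver cannot recover the original data. This is standard `JSON.stringify` behaviour, shown here independently of log4js.

```javascript
// NaN, Infinity and -Infinity become null; undefined array items become null
// and undefined object properties are dropped entirely.
console.log(JSON.stringify({ a: Number('abc'), b: 1 / 0, c: -1 / 0 })); // {"a":null,"b":null,"c":null}
console.log(JSON.stringify([undefined])); // [null]
console.log(JSON.stringify({ d: undefined })); // {}
```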
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/logFaces-appender.js
const log4js = require('../lib/log4js');

/*
  logFaces server configured with UDP receiver, using JSON format,
  listening on port 55201 will receive the logs from the appender below.
*/

log4js.configure({
  appenders: {
    logFaces: {
      type: '@log4js-node/logfaces-udp', // (mandatory) appender type
      application: 'MY-NODEJS', // (optional) name of the application (domain)
      remoteHost: 'localhost', // (optional) logFaces server host or IP address
      port: 55201, // (optional) logFaces UDP receiver port (must use JSON format)
      layout: {
        // (optional) the layout to use for messages
        type: 'pattern',
        pattern: '%m',
      },
    },
  },
  categories: { default: { appenders: ['logFaces'], level: 'info' } },
});

const logger = log4js.getLogger('myLogger');
logger.info('Testing message %s', 'arg1');
const log4js = require('../lib/log4js');

/*
  logFaces server configured with UDP receiver, using JSON format,
  listening on port 55201 will receive the logs from the appender below.
*/

log4js.configure({
  appenders: {
    logFaces: {
      type: '@log4js-node/logfaces-udp', // (mandatory) appender type
      application: 'MY-NODEJS', // (optional) name of the application (domain)
      remoteHost: 'localhost', // (optional) logFaces server host or IP address
      port: 55201, // (optional) logFaces UDP receiver port (must use JSON format)
      layout: {
        // (optional) the layout to use for messages
        type: 'pattern',
        pattern: '%m',
      },
    },
  },
  categories: { default: { appenders: ['logFaces'], level: 'info' } },
});

const logger = log4js.getLogger('myLogger');
logger.info('Testing message %s', 'arg1');
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/flush-on-exit.js
/**
 * run this, then "ab -c 10 -n 100 localhost:4444/" to test (in
 * another shell)
 */
const log4js = require('../lib/log4js');

log4js.configure({
  appenders: {
    cheese: { type: 'file', filename: 'cheese.log' },
  },
  categories: {
    default: { appenders: ['cheese'], level: 'debug' },
  },
});

const logger = log4js.getLogger('cheese');
const http = require('http');

http
  .createServer((request, response) => {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    const rd = Math.random() * 50;
    logger.info(`hello ${rd}`);
    response.write('hello ');
    if (Math.floor(rd) === 30) {
      log4js.shutdown(() => {
        process.exit(1);
      });
    }
    response.end();
  })
  .listen(4444);
/**
 * run this, then "ab -c 10 -n 100 localhost:4444/" to test (in
 * another shell)
 */
const log4js = require('../lib/log4js');

log4js.configure({
  appenders: {
    cheese: { type: 'file', filename: 'cheese.log' },
  },
  categories: {
    default: { appenders: ['cheese'], level: 'debug' },
  },
});

const logger = log4js.getLogger('cheese');
const http = require('http');

http
  .createServer((request, response) => {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    const rd = Math.random() * 50;
    logger.info(`hello ${rd}`);
    response.write('hello ');
    if (Math.floor(rd) === 30) {
      log4js.shutdown(() => {
        process.exit(1);
      });
    }
    response.end();
  })
  .listen(4444);
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/server-test.js
const { test } = require('tap'); const net = require('net'); const log4js = require('../../lib/log4js'); const vcr = require('../../lib/appenders/recording'); const levels = require('../../lib/levels'); const LoggingEvent = require('../../lib/LoggingEvent'); test('TCP Server', (batch) => { batch.test( 'should listen for TCP messages and re-send via process.send', (t) => { log4js.configure({ appenders: { vcr: { type: 'recording' }, tcp: { type: 'tcp-server', port: 5678 }, }, categories: { default: { appenders: ['vcr'], level: 'debug' }, }, }); // give the socket a chance to start up setTimeout(() => { const socket = net.connect(5678, () => { socket.write( `${new LoggingEvent( 'test-category', levels.INFO, ['something'], {} ).serialise()}__LOG4JS__${new LoggingEvent( 'test-category', levels.INFO, ['something else'], {} ).serialise()}__LOG4JS__some nonsense__LOG4JS__{"some":"json"}__LOG4JS__`, () => { socket.end(); setTimeout(() => { log4js.shutdown(() => { const logs = vcr.replay(); t.equal(logs.length, 4); t.match(logs[0], { data: ['something'], categoryName: 'test-category', level: { levelStr: 'INFO' }, context: {}, }); t.match(logs[1], { data: ['something else'], categoryName: 'test-category', level: { levelStr: 'INFO' }, context: {}, }); t.match(logs[2], { data: [ 'Unable to parse log:', 'some nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[3], { data: [ 'Unable to parse log:', '{"some":"json"}', 'because: ', TypeError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.end(); }); }, 100); } ); }); socket.unref(); }, 100); } ); batch.test('sending incomplete messages in chunks', (t) => { log4js.configure({ appenders: { vcr: { type: 'recording' }, tcp: { type: 'tcp-server' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' }, }, }); // give the socket a chance to start up setTimeout(() => { const socket = net.connect(5000, () => { const syncWrite = (dataArray, finalCallback) => { if (!Array.isArray(dataArray)) { dataArray = [dataArray]; } if (typeof finalCallback !== 'function') { finalCallback = () => {}; } setTimeout(() => { if (!dataArray.length) { finalCallback(); } else if (dataArray.length === 1) { socket.write(dataArray.shift(), finalCallback); } else { socket.write(dataArray.shift(), () => { syncWrite(dataArray, finalCallback); }); } }, 100); }; const dataArray = [ '__LOG4JS__', 'Hello__LOG4JS__World', '__LOG4JS__', 'testing nonsense', `__LOG4JS__more nonsense__LOG4JS__`, ]; const finalCallback = () => { socket.end(); setTimeout(() => { log4js.shutdown(() => { const logs = vcr.replay(); t.equal(logs.length, 8); t.match(logs[4], { data: [ 'Unable to parse log:', 'Hello', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[5], { data: [ 'Unable to parse log:', 'World', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[6], { data: [ 'Unable to parse log:', 'testing nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[7], { data: [ 'Unable to parse log:', 'more nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.end(); }); }, 100); }; syncWrite(dataArray, finalCallback); }); socket.unref(); }, 100); }); batch.end(); });
const { test } = require('tap'); const net = require('net'); const log4js = require('../../lib/log4js'); const vcr = require('../../lib/appenders/recording'); const levels = require('../../lib/levels'); const LoggingEvent = require('../../lib/LoggingEvent'); test('TCP Server', (batch) => { batch.test( 'should listen for TCP messages and re-send via process.send', (t) => { log4js.configure({ appenders: { vcr: { type: 'recording' }, tcp: { type: 'tcp-server', port: 5678 }, }, categories: { default: { appenders: ['vcr'], level: 'debug' }, }, }); // give the socket a chance to start up setTimeout(() => { const socket = net.connect(5678, () => { socket.write( `${new LoggingEvent( 'test-category', levels.INFO, ['something'], {} ).serialise()}__LOG4JS__${new LoggingEvent( 'test-category', levels.INFO, ['something else'], {} ).serialise()}__LOG4JS__some nonsense__LOG4JS__{"some":"json"}__LOG4JS__`, () => { socket.end(); setTimeout(() => { log4js.shutdown(() => { const logs = vcr.replay(); t.equal(logs.length, 4); t.match(logs[0], { data: ['something'], categoryName: 'test-category', level: { levelStr: 'INFO' }, context: {}, }); t.match(logs[1], { data: ['something else'], categoryName: 'test-category', level: { levelStr: 'INFO' }, context: {}, }); t.match(logs[2], { data: [ 'Unable to parse log:', 'some nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[3], { data: [ 'Unable to parse log:', '{"some":"json"}', 'because: ', TypeError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.end(); }); }, 100); } ); }); socket.unref(); }, 100); } ); batch.test('sending incomplete messages in chunks', (t) => { log4js.configure({ appenders: { vcr: { type: 'recording' }, tcp: { type: 'tcp-server' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' }, }, }); // give the socket a chance to start up setTimeout(() => { const socket = net.connect(5000, () => { const syncWrite = (dataArray, finalCallback) => { if (!Array.isArray(dataArray)) { dataArray = [dataArray]; } if (typeof finalCallback !== 'function') { finalCallback = () => {}; } setTimeout(() => { if (!dataArray.length) { finalCallback(); } else if (dataArray.length === 1) { socket.write(dataArray.shift(), finalCallback); } else { socket.write(dataArray.shift(), () => { syncWrite(dataArray, finalCallback); }); } }, 100); }; const dataArray = [ '__LOG4JS__', 'Hello__LOG4JS__World', '__LOG4JS__', 'testing nonsense', `__LOG4JS__more nonsense__LOG4JS__`, ]; const finalCallback = () => { socket.end(); setTimeout(() => { log4js.shutdown(() => { const logs = vcr.replay(); t.equal(logs.length, 8); t.match(logs[4], { data: [ 'Unable to parse log:', 'Hello', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[5], { data: [ 'Unable to parse log:', 'World', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[6], { data: [ 'Unable to parse log:', 'testing nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.match(logs[7], { data: [ 'Unable to parse log:', 'more nonsense', 'because: ', SyntaxError, ], categoryName: 'log4js', level: { levelStr: 'ERROR' }, context: {}, }); t.end(); }); }, 100); }; syncWrite(dataArray, finalCallback); }); socket.unref(); }, 100); }); batch.end(); });
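The test above frames messages on the wire with a `__LOG4JS__` delimiter and tolerates chunked or incomplete writes. The following is a minimal, standalone sketch of that framing idea; it is not the actual tcp-server appender code, and `makeFramer` is a hypothetical helper name used only for illustration.

```javascript
// Hypothetical sketch of delimiter-based framing over a chunked stream.
const DELIMITER = '__LOG4JS__';

function makeFramer(onMessage) {
  let buffer = '';
  return (chunk) => {
    buffer += chunk;
    const parts = buffer.split(DELIMITER);
    // The last element is either '' (chunk ended on a delimiter) or an
    // incomplete message; keep it in the buffer for the next chunk.
    buffer = parts.pop();
    parts.filter((p) => p.length > 0).forEach(onMessage);
  };
}

// Usage: messages survive arbitrary chunk boundaries.
const received = [];
const feed = makeFramer((msg) => received.push(msg));
feed('Hello__LOG');
feed('4JS__World__LOG4JS__');
console.log(received); // [ 'Hello', 'World' ]
```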
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/multiprocess-shutdown-test.js
const { test } = require('tap'); const net = require('net'); const childProcess = require('child_process'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); test('multiprocess appender shutdown (master)', { timeout: 10000 }, (t) => { log4js.configure({ appenders: { stdout: { type: 'stdout' }, multi: { type: 'multiprocess', mode: 'master', loggerPort: 12345, appender: 'stdout', }, }, categories: { default: { appenders: ['multi'], level: 'debug' } }, }); setTimeout(() => { log4js.shutdown(() => { setTimeout(() => { net .connect({ port: 12345 }, () => { t.fail('connection should not still work'); t.end(); }) .on('error', (err) => { t.ok(err, 'we got a connection error'); t.end(); }); }, 1000); }); }, 1000); }); test('multiprocess appender shutdown (worker)', (t) => { const fakeConnection = { evts: {}, msgs: [], on(evt, cb) { this.evts[evt] = cb; }, write(data) { this.msgs.push(data); }, removeAllListeners() { this.removeAllListenersCalled = true; }, end(cb) { this.endCb = cb; }, }; const logLib = sandbox.require('../../lib/log4js', { requires: { net: { createConnection() { return fakeConnection; }, }, }, }); logLib.configure({ appenders: { worker: { type: 'multiprocess', mode: 'worker' } }, categories: { default: { appenders: ['worker'], level: 'debug' } }, }); logLib .getLogger() .info( 'Putting something in the buffer before the connection is established' ); // nothing been written yet. t.equal(fakeConnection.msgs.length, 0); let shutdownFinished = false; logLib.shutdown(() => { shutdownFinished = true; }); // still nothing been written yet. t.equal(fakeConnection.msgs.length, 0); fakeConnection.evts.connect(); setTimeout(() => { t.equal(fakeConnection.msgs.length, 2); t.ok(fakeConnection.removeAllListenersCalled); fakeConnection.endCb(); t.ok(shutdownFinished); t.end(); }, 500); }); test('multiprocess appender crash (worker)', (t) => { const loggerPort = 12346; const vcr = require('../../lib/appenders/recording'); log4js.configure({ appenders: { console: { type: 'recording' }, multi: { type: 'multiprocess', mode: 'master', loggerPort, appender: 'console', }, }, categories: { default: { appenders: ['multi'], level: 'debug' } }, }); const worker = childProcess.fork(require.resolve('../multiprocess-worker'), [ 'start-multiprocess-worker', loggerPort, ]); worker.on('message', (m) => { if (m === 'worker is done') { setTimeout(() => { worker.kill(); t.equal(vcr.replay()[0].data[0], 'Logging from worker'); log4js.shutdown(() => t.end()); }, 100); } }); });
const { test } = require('tap'); const net = require('net'); const childProcess = require('child_process'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); test('multiprocess appender shutdown (master)', { timeout: 10000 }, (t) => { log4js.configure({ appenders: { stdout: { type: 'stdout' }, multi: { type: 'multiprocess', mode: 'master', loggerPort: 12345, appender: 'stdout', }, }, categories: { default: { appenders: ['multi'], level: 'debug' } }, }); setTimeout(() => { log4js.shutdown(() => { setTimeout(() => { net .connect({ port: 12345 }, () => { t.fail('connection should not still work'); t.end(); }) .on('error', (err) => { t.ok(err, 'we got a connection error'); t.end(); }); }, 1000); }); }, 1000); }); test('multiprocess appender shutdown (worker)', (t) => { const fakeConnection = { evts: {}, msgs: [], on(evt, cb) { this.evts[evt] = cb; }, write(data) { this.msgs.push(data); }, removeAllListeners() { this.removeAllListenersCalled = true; }, end(cb) { this.endCb = cb; }, }; const logLib = sandbox.require('../../lib/log4js', { requires: { net: { createConnection() { return fakeConnection; }, }, }, }); logLib.configure({ appenders: { worker: { type: 'multiprocess', mode: 'worker' } }, categories: { default: { appenders: ['worker'], level: 'debug' } }, }); logLib .getLogger() .info( 'Putting something in the buffer before the connection is established' ); // nothing been written yet. t.equal(fakeConnection.msgs.length, 0); let shutdownFinished = false; logLib.shutdown(() => { shutdownFinished = true; }); // still nothing been written yet. t.equal(fakeConnection.msgs.length, 0); fakeConnection.evts.connect(); setTimeout(() => { t.equal(fakeConnection.msgs.length, 2); t.ok(fakeConnection.removeAllListenersCalled); fakeConnection.endCb(); t.ok(shutdownFinished); t.end(); }, 500); }); test('multiprocess appender crash (worker)', (t) => { const loggerPort = 12346; const vcr = require('../../lib/appenders/recording'); log4js.configure({ appenders: { console: { type: 'recording' }, multi: { type: 'multiprocess', mode: 'master', loggerPort, appender: 'console', }, }, categories: { default: { appenders: ['multi'], level: 'debug' } }, }); const worker = childProcess.fork(require.resolve('../multiprocess-worker'), [ 'start-multiprocess-worker', loggerPort, ]); worker.on('message', (m) => { if (m === 'worker is done') { setTimeout(() => { worker.kill(); t.equal(vcr.replay()[0].data[0], 'Logging from worker'); log4js.shutdown(() => t.end()); }, 100); } }); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/layouts-test.js
const { test } = require('tap'); const debug = require('debug'); const os = require('os'); const path = require('path'); const { EOL } = os; // used for patternLayout tests. function testPattern(assert, layout, event, tokens, pattern, value) { assert.equal(layout(pattern, tokens)(event), value); } test('log4js layouts', (batch) => { batch.test('colouredLayout', (t) => { const layout = require('../../lib/layouts').colouredLayout; t.test('should apply level colour codes to output', (assert) => { const output = layout({ data: ['nonsense'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { toString() { return 'ERROR'; }, colour: 'red', }, }); assert.equal( output, '\x1B[91m[2010-12-05T14:18:30.045] [ERROR] cheese - \x1B[39mnonsense' ); assert.end(); }); t.test( 'should support the console.log format for the message', (assert) => { const output = layout({ data: ['thing %d', 2], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { toString() { return 'ERROR'; }, colour: 'red', }, }); assert.equal( output, '\x1B[91m[2010-12-05T14:18:30.045] [ERROR] cheese - \x1B[39mthing 2' ); assert.end(); } ); t.end(); }); batch.test('messagePassThroughLayout', (t) => { const layout = require('../../lib/layouts').messagePassThroughLayout; t.equal( layout({ data: ['nonsense'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), 'nonsense', 'should take a logevent and output only the message' ); t.equal( layout({ data: ['thing %d', 1, 'cheese'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), 'thing 1 cheese', 'should support the console.log format for the message' ); t.equal( layout({ data: [{ thing: 1 }], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), '{ thing: 1 }', 'should output the first item even if it is not a string' ); t.match( layout({ data: [new Error()], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), /at (Test\.batch\.test(\.t)?|Test\.<anonymous>)\s+\((.*)test[\\/]tap[\\/]layouts-test\.js:\d+:\d+\)/, 'regexp did not return a match - should print the stacks of a passed error objects' ); t.test('with passed augmented errors', (assert) => { const e = new Error('My Unique Error Message'); e.augmented = 'My Unique attribute value'; e.augObj = { at1: 'at2' }; const layoutOutput = layout({ data: [e], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }); assert.match( layoutOutput, /Error: My Unique Error Message/, 'should print the contained error message' ); assert.match( layoutOutput, /augmented:\s'My Unique attribute value'/, 'should print error augmented string attributes' ); assert.match( layoutOutput, /augObj:\s\{ at1: 'at2' \}/, 'should print error augmented object attributes' ); assert.end(); }); t.end(); }); batch.test('basicLayout', (t) => { const layout = require('../../lib/layouts').basicLayout; const event = { data: ['this is a test'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'tests', level: { toString() { return 'DEBUG'; }, }, }; t.equal( layout(event), '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test' ); t.test( 'should output a stacktrace, message if the event 
has an error attached', (assert) => { let i; const error = new Error('Some made-up error'); const stack = error.stack.split(/\n/); event.data = ['this is a test', error]; const output = layout(event); const lines = output.split(/\n/); assert.equal(lines.length, stack.length); assert.equal( lines[0], '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test Error: Some made-up error' ); for (i = 1; i < stack.length; i++) { assert.equal(lines[i], stack[i]); } assert.end(); } ); t.test( 'should output any extra data in the log event as util.inspect strings', (assert) => { event.data = [ 'this is a test', { name: 'Cheese', message: 'Gorgonzola smells.', }, ]; const output = layout(event); assert.equal( output, '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test ' + "{ name: 'Cheese', message: 'Gorgonzola smells.' }" ); assert.end(); } ); t.end(); }); batch.test('dummyLayout', (t) => { const layout = require('../../lib/layouts').dummyLayout; t.test('should output just the first element of the log data', (assert) => { const event = { data: ['this is the first value', 'this is not'], startTime: new Date('2010-12-05 14:18:30.045'), categoryName: 'multiple.levels.of.tests', level: { toString() { return 'DEBUG'; }, colour: 'cyan', }, }; assert.equal(layout(event), 'this is the first value'); assert.end(); }); t.end(); }); batch.test('patternLayout', (t) => { const originalListener = process.listeners('warning')[process.listeners('warning').length - 1]; const warningListener = (error) => { if (error.name === 'DeprecationWarning') { if ( error.code.startsWith('log4js-node-DEP0003') || error.code.startsWith('log4js-node-DEP0004') ) { return; } } originalListener(error); }; process.off('warning', originalListener); process.on('warning', warningListener); const debugWasEnabled = debug.enabled('log4js:layouts'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:layouts`); batch.teardown(async () => { // next event loop so that past warnings will not be printed setImmediate(() => { process.off('warning', warningListener); process.on('warning', originalListener); }); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const tokens = { testString: 'testStringToken', testFunction() { return 'testFunctionToken'; }, fnThatUsesLogEvent(logEvent) { return logEvent.level.toString(); }, }; // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = path.normalize('/log4js-node/test/tap/layouts-test.js'); const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const event = { data: ['this is a test'], startTime: new Date('2010-12-05 14:18:30.045'), categoryName: 'multiple.levels.of.tests', level: { toString() 
{ return 'DEBUG'; }, colour: 'cyan', }, context: tokens, // location callStack, fileName, lineNumber, columnNumber, className, functionName, functionAlias, callerName, }; event.startTime.getTimezoneOffset = () => -600; const layout = require('../../lib/layouts').patternLayout; t.test( 'should default to "time logLevel loggerName - message"', (assert) => { testPattern( assert, layout, event, tokens, null, `14:18:30 DEBUG multiple.levels.of.tests - this is a test${EOL}` ); assert.end(); } ); t.test('%r should output time only', (assert) => { testPattern(assert, layout, event, tokens, '%r', '14:18:30'); assert.end(); }); t.test('%p should output the log level', (assert) => { testPattern(assert, layout, event, tokens, '%p', 'DEBUG'); assert.end(); }); t.test('%c should output the log category', (assert) => { testPattern( assert, layout, event, tokens, '%c', 'multiple.levels.of.tests' ); assert.end(); }); t.test('%m should output the log data', (assert) => { testPattern(assert, layout, event, tokens, '%m', 'this is a test'); assert.end(); }); t.test('%n should output a new line', (assert) => { testPattern(assert, layout, event, tokens, '%n', EOL); assert.end(); }); t.test('%h should output hostname', (assert) => { testPattern( assert, layout, event, tokens, '%h', os.hostname().toString() ); assert.end(); }); t.test('%z should output pid', (assert) => { testPattern(assert, layout, event, tokens, '%z', process.pid.toString()); assert.end(); }); t.test('%z should pick up pid from log event if present', (assert) => { event.pid = '1234'; testPattern(assert, layout, event, tokens, '%z', '1234'); delete event.pid; assert.end(); }); t.test('%y should output pid (was cluster info)', (assert) => { testPattern(assert, layout, event, tokens, '%y', process.pid.toString()); assert.end(); }); t.test( '%c should handle category names like java-style package names', (assert) => { testPattern(assert, layout, event, tokens, '%c{1}', 'tests'); testPattern(assert, layout, event, tokens, '%c{2}', 'of.tests'); testPattern(assert, layout, event, tokens, '%c{3}', 'levels.of.tests'); testPattern( assert, layout, event, tokens, '%c{4}', 'multiple.levels.of.tests' ); testPattern( assert, layout, event, tokens, '%c{5}', 'multiple.levels.of.tests' ); testPattern( assert, layout, event, tokens, '%c{99}', 'multiple.levels.of.tests' ); assert.end(); } ); t.test('%d should output the date in ISO8601 format', (assert) => { testPattern( assert, layout, event, tokens, '%d', '2010-12-05T14:18:30.045' ); assert.end(); }); t.test('%d should allow for format specification', (assert) => { testPattern( assert, layout, event, tokens, '%d{ISO8601}', '2010-12-05T14:18:30.045' ); testPattern( assert, layout, event, tokens, '%d{ISO8601_WITH_TZ_OFFSET}', '2010-12-05T14:18:30.045+10:00' ); const DEP0003 = debugLogs.filter( (e) => e.indexOf('log4js-node-DEP0003') > -1 ).length; testPattern( assert, layout, event, tokens, '%d{ABSOLUTE}', // deprecated '14:18:30.045' ); assert.equal( debugLogs.filter((e) => e.indexOf('log4js-node-DEP0003') > -1).length, DEP0003 + 1, 'deprecation log4js-node-DEP0003 emitted' ); testPattern( assert, layout, event, tokens, '%d{ABSOLUTETIME}', '14:18:30.045' ); const DEP0004 = debugLogs.filter( (e) => e.indexOf('log4js-node-DEP0004') > -1 ).length; testPattern( assert, layout, event, tokens, '%d{DATE}', // deprecated '05 12 2010 14:18:30.045' ); assert.equal( debugLogs.filter((e) => e.indexOf('log4js-node-DEP0004') > -1).length, DEP0004 + 1, 'deprecation log4js-node-DEP0004 emitted' ); testPattern( assert, layout, 
event, tokens, '%d{DATETIME}', '05 12 2010 14:18:30.045' ); testPattern( assert, layout, event, tokens, '%d{yy MM dd hh mm ss}', '10 12 05 14 18 30' ); testPattern( assert, layout, event, tokens, '%d{yyyy MM dd}', '2010 12 05' ); testPattern( assert, layout, event, tokens, '%d{yyyy MM dd hh mm ss SSS}', '2010 12 05 14 18 30 045' ); assert.end(); }); t.test('%% should output %', (assert) => { testPattern(assert, layout, event, tokens, '%%', '%'); assert.end(); }); t.test('%f should output filename', (assert) => { testPattern(assert, layout, event, tokens, '%f', fileName); assert.end(); }); t.test('%f should handle filename depth', (assert) => { testPattern(assert, layout, event, tokens, '%f{1}', 'layouts-test.js'); testPattern( assert, layout, event, tokens, '%f{2}', path.join('tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{3}', path.join('test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{4}', path.join('log4js-node', 'test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{5}', path.join('/log4js-node', 'test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{99}', path.join('/log4js-node', 'test', 'tap', 'layouts-test.js') ); assert.end(); }); t.test('%f should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%.5f', fileName.slice(0, 5)); testPattern( assert, layout, event, tokens, '%20f{1}', ' layouts-test.js' ); testPattern( assert, layout, event, tokens, '%30.30f{2}', ` ${path.join('tap', 'layouts-test.js')}` ); testPattern(assert, layout, event, tokens, '%10.-5f{1}', ' st.js'); assert.end(); }); t.test('%l should output line number', (assert) => { testPattern(assert, layout, event, tokens, '%l', lineNumber.toString()); assert.end(); }); t.test('%l should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%5.10l', ' 1'); testPattern(assert, layout, event, tokens, '%.5l', '1'); testPattern(assert, layout, event, tokens, '%.-5l', '1'); testPattern(assert, layout, event, tokens, '%-5l', '1 '); assert.end(); }); t.test('%o should output column postion', (assert) => { testPattern(assert, layout, event, tokens, '%o', columnNumber.toString()); assert.end(); }); t.test('%o should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%5.10o', ' 14'); testPattern(assert, layout, event, tokens, '%.5o', '14'); testPattern(assert, layout, event, tokens, '%.1o', '1'); testPattern(assert, layout, event, tokens, '%.-1o', '4'); testPattern(assert, layout, event, tokens, '%-5o', '14 '); assert.end(); }); t.test('%s should output stack', (assert) => { testPattern(assert, layout, event, tokens, '%s', callStack); assert.end(); }); t.test( '%f should output empty string when fileName not exist', (assert) => { delete event.fileName; testPattern(assert, layout, event, tokens, '%f', ''); assert.end(); } ); t.test( '%l should output empty string when lineNumber not exist', (assert) => { delete event.lineNumber; testPattern(assert, layout, event, tokens, '%l', ''); assert.end(); } ); t.test( '%o should output empty string when columnNumber not exist', (assert) => { delete event.columnNumber; testPattern(assert, layout, event, tokens, '%o', ''); assert.end(); } ); t.test( '%s should output empty string when callStack not exist', (assert) => { delete event.callStack; testPattern(assert, layout, event, tokens, '%s', ''); assert.end(); } ); t.test('should output anything not preceded 
by % as literal', (assert) => { testPattern( assert, layout, event, tokens, 'blah blah blah', 'blah blah blah' ); assert.end(); }); t.test( 'should output the original string if no replacer matches the token', (assert) => { testPattern(assert, layout, event, tokens, '%a{3}', 'a{3}'); assert.end(); } ); t.test('should handle complicated patterns', (assert) => { testPattern( assert, layout, event, tokens, '%m%n %c{2} at %d{ABSOLUTE} cheese %p%n', // deprecated `this is a test${EOL} of.tests at 14:18:30.045 cheese DEBUG${EOL}` ); testPattern( assert, layout, event, tokens, '%m%n %c{2} at %d{ABSOLUTETIME} cheese %p%n', `this is a test${EOL} of.tests at 14:18:30.045 cheese DEBUG${EOL}` ); assert.end(); }); t.test('should truncate fields if specified', (assert) => { testPattern(assert, layout, event, tokens, '%.4m', 'this'); testPattern(assert, layout, event, tokens, '%.7m', 'this is'); testPattern(assert, layout, event, tokens, '%.9m', 'this is a'); testPattern(assert, layout, event, tokens, '%.14m', 'this is a test'); testPattern( assert, layout, event, tokens, '%.2919102m', 'this is a test' ); testPattern(assert, layout, event, tokens, '%.-4m', 'test'); assert.end(); }); t.test('should pad fields if specified', (assert) => { testPattern(assert, layout, event, tokens, '%10p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%8p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%6p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%4p', 'DEBUG'); testPattern(assert, layout, event, tokens, '%-4p', 'DEBUG'); testPattern(assert, layout, event, tokens, '%-6p', 'DEBUG '); testPattern(assert, layout, event, tokens, '%-8p', 'DEBUG '); testPattern(assert, layout, event, tokens, '%-10p', 'DEBUG '); assert.end(); }); t.test('%[%r%] should output colored time', (assert) => { testPattern( assert, layout, event, tokens, '%[%r%]', '\x1B[36m14:18:30\x1B[39m' ); assert.end(); }); t.test( '%x{testString} should output the string stored in tokens', (assert) => { testPattern( assert, layout, event, tokens, '%x{testString}', 'testStringToken' ); assert.end(); } ); t.test( '%x{testFunction} should output the result of the function stored in tokens', (assert) => { testPattern( assert, layout, event, tokens, '%x{testFunction}', 'testFunctionToken' ); assert.end(); } ); t.test( '%x{doesNotExist} should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, tokens, '%x{doesNotExist}', 'null'); assert.end(); } ); t.test( '%x{fnThatUsesLogEvent} should be able to use the logEvent', (assert) => { testPattern( assert, layout, event, tokens, '%x{fnThatUsesLogEvent}', 'DEBUG' ); assert.end(); } ); t.test('%x should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, tokens, '%x', 'null'); assert.end(); }); t.test( '%X{testString} should output the string stored in tokens', (assert) => { testPattern( assert, layout, event, {}, '%X{testString}', 'testStringToken' ); assert.end(); } ); t.test( '%X{testFunction} should output the result of the function stored in tokens', (assert) => { testPattern( assert, layout, event, {}, '%X{testFunction}', 'testFunctionToken' ); assert.end(); } ); t.test( '%X{doesNotExist} should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, {}, '%X{doesNotExist}', 'null'); assert.end(); } ); t.test( '%X{fnThatUsesLogEvent} should be able to use the logEvent', (assert) => { testPattern( assert, layout, event, {}, '%X{fnThatUsesLogEvent}', 'DEBUG' ); assert.end(); } ); t.test('%X should 
output the string stored in tokens', (assert) => { testPattern(assert, layout, event, {}, '%X', 'null'); assert.end(); }); t.test('%M should output function name', (assert) => { testPattern(assert, layout, event, tokens, '%M', functionName); assert.end(); }); t.test( '%M should output empty string when functionName not exist', (assert) => { delete event.functionName; testPattern(assert, layout, event, tokens, '%M', ''); assert.end(); } ); t.test('%C should output class name', (assert) => { testPattern(assert, layout, event, tokens, '%C', className); assert.end(); }); t.test( '%C should output empty string when className not exist', (assert) => { delete event.className; testPattern(assert, layout, event, tokens, '%C', ''); assert.end(); } ); t.test('%A should output function alias', (assert) => { testPattern(assert, layout, event, tokens, '%A', functionAlias); assert.end(); }); t.test( '%A should output empty string when functionAlias not exist', (assert) => { delete event.functionAlias; testPattern(assert, layout, event, tokens, '%A', ''); assert.end(); } ); t.test('%F should output fully qualified caller name', (assert) => { testPattern(assert, layout, event, tokens, '%F', callerName); assert.end(); }); t.test( '%F should output empty string when callerName not exist', (assert) => { delete event.callerName; testPattern(assert, layout, event, tokens, '%F', ''); assert.end(); } ); t.end(); }); batch.test('layout makers', (t) => { const layouts = require('../../lib/layouts'); t.test('should have a maker for each layout', (assert) => { assert.ok(layouts.layout('messagePassThrough')); assert.ok(layouts.layout('basic')); assert.ok(layouts.layout('colored')); assert.ok(layouts.layout('coloured')); assert.ok(layouts.layout('pattern')); assert.ok(layouts.layout('dummy')); assert.end(); }); t.test( 'layout pattern maker should pass pattern and tokens to layout from config', (assert) => { let layout = layouts.layout('pattern', { pattern: '%%' }); assert.equal(layout({}), '%'); layout = layouts.layout('pattern', { pattern: '%x{testStringToken}', tokens: { testStringToken: 'cheese' }, }); assert.equal(layout({}), 'cheese'); assert.end(); } ); t.end(); }); batch.test('add layout', (t) => { const layouts = require('../../lib/layouts'); t.test('should be able to add a layout', (assert) => { layouts.addLayout('test_layout', (config) => { assert.equal(config, 'test_config'); return function (logEvent) { return `TEST LAYOUT >${logEvent.data}`; }; }); const serializer = layouts.layout('test_layout', 'test_config'); assert.ok(serializer); assert.equal(serializer({ data: 'INPUT' }), 'TEST LAYOUT >INPUT'); assert.end(); }); t.end(); }); batch.end(); });
const { test } = require('tap'); const debug = require('debug'); const os = require('os'); const path = require('path'); const { EOL } = os; // used for patternLayout tests. function testPattern(assert, layout, event, tokens, pattern, value) { assert.equal(layout(pattern, tokens)(event), value); } test('log4js layouts', (batch) => { batch.test('colouredLayout', (t) => { const layout = require('../../lib/layouts').colouredLayout; t.test('should apply level colour codes to output', (assert) => { const output = layout({ data: ['nonsense'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { toString() { return 'ERROR'; }, colour: 'red', }, }); assert.equal( output, '\x1B[91m[2010-12-05T14:18:30.045] [ERROR] cheese - \x1B[39mnonsense' ); assert.end(); }); t.test( 'should support the console.log format for the message', (assert) => { const output = layout({ data: ['thing %d', 2], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { toString() { return 'ERROR'; }, colour: 'red', }, }); assert.equal( output, '\x1B[91m[2010-12-05T14:18:30.045] [ERROR] cheese - \x1B[39mthing 2' ); assert.end(); } ); t.end(); }); batch.test('messagePassThroughLayout', (t) => { const layout = require('../../lib/layouts').messagePassThroughLayout; t.equal( layout({ data: ['nonsense'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), 'nonsense', 'should take a logevent and output only the message' ); t.equal( layout({ data: ['thing %d', 1, 'cheese'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), 'thing 1 cheese', 'should support the console.log format for the message' ); t.equal( layout({ data: [{ thing: 1 }], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), '{ thing: 1 }', 'should output the first item even if it is not a string' ); t.match( layout({ data: [new Error()], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }), /at (Test\.batch\.test(\.t)?|Test\.<anonymous>)\s+\((.*)test[\\/]tap[\\/]layouts-test\.js:\d+:\d+\)/, 'regexp did not return a match - should print the stacks of a passed error objects' ); t.test('with passed augmented errors', (assert) => { const e = new Error('My Unique Error Message'); e.augmented = 'My Unique attribute value'; e.augObj = { at1: 'at2' }; const layoutOutput = layout({ data: [e], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'cheese', level: { colour: 'green', toString() { return 'ERROR'; }, }, }); assert.match( layoutOutput, /Error: My Unique Error Message/, 'should print the contained error message' ); assert.match( layoutOutput, /augmented:\s'My Unique attribute value'/, 'should print error augmented string attributes' ); assert.match( layoutOutput, /augObj:\s\{ at1: 'at2' \}/, 'should print error augmented object attributes' ); assert.end(); }); t.end(); }); batch.test('basicLayout', (t) => { const layout = require('../../lib/layouts').basicLayout; const event = { data: ['this is a test'], startTime: new Date(2010, 11, 5, 14, 18, 30, 45), categoryName: 'tests', level: { toString() { return 'DEBUG'; }, }, }; t.equal( layout(event), '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test' ); t.test( 'should output a stacktrace, message if the event 
has an error attached', (assert) => { let i; const error = new Error('Some made-up error'); const stack = error.stack.split(/\n/); event.data = ['this is a test', error]; const output = layout(event); const lines = output.split(/\n/); assert.equal(lines.length, stack.length); assert.equal( lines[0], '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test Error: Some made-up error' ); for (i = 1; i < stack.length; i++) { assert.equal(lines[i], stack[i]); } assert.end(); } ); t.test( 'should output any extra data in the log event as util.inspect strings', (assert) => { event.data = [ 'this is a test', { name: 'Cheese', message: 'Gorgonzola smells.', }, ]; const output = layout(event); assert.equal( output, '[2010-12-05T14:18:30.045] [DEBUG] tests - this is a test ' + "{ name: 'Cheese', message: 'Gorgonzola smells.' }" ); assert.end(); } ); t.end(); }); batch.test('dummyLayout', (t) => { const layout = require('../../lib/layouts').dummyLayout; t.test('should output just the first element of the log data', (assert) => { const event = { data: ['this is the first value', 'this is not'], startTime: new Date('2010-12-05 14:18:30.045'), categoryName: 'multiple.levels.of.tests', level: { toString() { return 'DEBUG'; }, colour: 'cyan', }, }; assert.equal(layout(event), 'this is the first value'); assert.end(); }); t.end(); }); batch.test('patternLayout', (t) => { const originalListener = process.listeners('warning')[process.listeners('warning').length - 1]; const warningListener = (error) => { if (error.name === 'DeprecationWarning') { if ( error.code.startsWith('log4js-node-DEP0003') || error.code.startsWith('log4js-node-DEP0004') ) { return; } } originalListener(error); }; process.off('warning', originalListener); process.on('warning', warningListener); const debugWasEnabled = debug.enabled('log4js:layouts'); const debugLogs = []; const originalWrite = process.stderr.write; process.stderr.write = (string, encoding, fd) => { debugLogs.push(string); if (debugWasEnabled) { originalWrite.apply(process.stderr, [string, encoding, fd]); } }; const originalNamespace = debug.disable(); debug.enable(`${originalNamespace}, log4js:layouts`); batch.teardown(async () => { // next event loop so that past warnings will not be printed setImmediate(() => { process.off('warning', warningListener); process.on('warning', originalListener); }); process.stderr.write = originalWrite; debug.enable(originalNamespace); }); const tokens = { testString: 'testStringToken', testFunction() { return 'testFunctionToken'; }, fnThatUsesLogEvent(logEvent) { return logEvent.level.toString(); }, }; // console.log([Error('123').stack.split('\n').slice(1).join('\n')]) const callStack = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len const fileName = path.normalize('/log4js-node/test/tap/layouts-test.js'); const lineNumber = 1; const columnNumber = 14; const className = 'Foo'; const functionName = 'bar'; const functionAlias = 'baz'; const callerName = 'Foo.bar [as baz]'; const event = { data: ['this is a test'], startTime: new Date('2010-12-05 14:18:30.045'), categoryName: 'multiple.levels.of.tests', level: { toString() 
{ return 'DEBUG'; }, colour: 'cyan', }, context: tokens, // location callStack, fileName, lineNumber, columnNumber, className, functionName, functionAlias, callerName, }; event.startTime.getTimezoneOffset = () => -600; const layout = require('../../lib/layouts').patternLayout; t.test( 'should default to "time logLevel loggerName - message"', (assert) => { testPattern( assert, layout, event, tokens, null, `14:18:30 DEBUG multiple.levels.of.tests - this is a test${EOL}` ); assert.end(); } ); t.test('%r should output time only', (assert) => { testPattern(assert, layout, event, tokens, '%r', '14:18:30'); assert.end(); }); t.test('%p should output the log level', (assert) => { testPattern(assert, layout, event, tokens, '%p', 'DEBUG'); assert.end(); }); t.test('%c should output the log category', (assert) => { testPattern( assert, layout, event, tokens, '%c', 'multiple.levels.of.tests' ); assert.end(); }); t.test('%m should output the log data', (assert) => { testPattern(assert, layout, event, tokens, '%m', 'this is a test'); assert.end(); }); t.test('%n should output a new line', (assert) => { testPattern(assert, layout, event, tokens, '%n', EOL); assert.end(); }); t.test('%h should output hostname', (assert) => { testPattern( assert, layout, event, tokens, '%h', os.hostname().toString() ); assert.end(); }); t.test('%z should output pid', (assert) => { testPattern(assert, layout, event, tokens, '%z', process.pid.toString()); assert.end(); }); t.test('%z should pick up pid from log event if present', (assert) => { event.pid = '1234'; testPattern(assert, layout, event, tokens, '%z', '1234'); delete event.pid; assert.end(); }); t.test('%y should output pid (was cluster info)', (assert) => { testPattern(assert, layout, event, tokens, '%y', process.pid.toString()); assert.end(); }); t.test( '%c should handle category names like java-style package names', (assert) => { testPattern(assert, layout, event, tokens, '%c{1}', 'tests'); testPattern(assert, layout, event, tokens, '%c{2}', 'of.tests'); testPattern(assert, layout, event, tokens, '%c{3}', 'levels.of.tests'); testPattern( assert, layout, event, tokens, '%c{4}', 'multiple.levels.of.tests' ); testPattern( assert, layout, event, tokens, '%c{5}', 'multiple.levels.of.tests' ); testPattern( assert, layout, event, tokens, '%c{99}', 'multiple.levels.of.tests' ); assert.end(); } ); t.test('%d should output the date in ISO8601 format', (assert) => { testPattern( assert, layout, event, tokens, '%d', '2010-12-05T14:18:30.045' ); assert.end(); }); t.test('%d should allow for format specification', (assert) => { testPattern( assert, layout, event, tokens, '%d{ISO8601}', '2010-12-05T14:18:30.045' ); testPattern( assert, layout, event, tokens, '%d{ISO8601_WITH_TZ_OFFSET}', '2010-12-05T14:18:30.045+10:00' ); const DEP0003 = debugLogs.filter( (e) => e.indexOf('log4js-node-DEP0003') > -1 ).length; testPattern( assert, layout, event, tokens, '%d{ABSOLUTE}', // deprecated '14:18:30.045' ); assert.equal( debugLogs.filter((e) => e.indexOf('log4js-node-DEP0003') > -1).length, DEP0003 + 1, 'deprecation log4js-node-DEP0003 emitted' ); testPattern( assert, layout, event, tokens, '%d{ABSOLUTETIME}', '14:18:30.045' ); const DEP0004 = debugLogs.filter( (e) => e.indexOf('log4js-node-DEP0004') > -1 ).length; testPattern( assert, layout, event, tokens, '%d{DATE}', // deprecated '05 12 2010 14:18:30.045' ); assert.equal( debugLogs.filter((e) => e.indexOf('log4js-node-DEP0004') > -1).length, DEP0004 + 1, 'deprecation log4js-node-DEP0004 emitted' ); testPattern( assert, layout, 
event, tokens, '%d{DATETIME}', '05 12 2010 14:18:30.045' ); testPattern( assert, layout, event, tokens, '%d{yy MM dd hh mm ss}', '10 12 05 14 18 30' ); testPattern( assert, layout, event, tokens, '%d{yyyy MM dd}', '2010 12 05' ); testPattern( assert, layout, event, tokens, '%d{yyyy MM dd hh mm ss SSS}', '2010 12 05 14 18 30 045' ); assert.end(); }); t.test('%% should output %', (assert) => { testPattern(assert, layout, event, tokens, '%%', '%'); assert.end(); }); t.test('%f should output filename', (assert) => { testPattern(assert, layout, event, tokens, '%f', fileName); assert.end(); }); t.test('%f should handle filename depth', (assert) => { testPattern(assert, layout, event, tokens, '%f{1}', 'layouts-test.js'); testPattern( assert, layout, event, tokens, '%f{2}', path.join('tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{3}', path.join('test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{4}', path.join('log4js-node', 'test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{5}', path.join('/log4js-node', 'test', 'tap', 'layouts-test.js') ); testPattern( assert, layout, event, tokens, '%f{99}', path.join('/log4js-node', 'test', 'tap', 'layouts-test.js') ); assert.end(); }); t.test('%f should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%.5f', fileName.slice(0, 5)); testPattern( assert, layout, event, tokens, '%20f{1}', ' layouts-test.js' ); testPattern( assert, layout, event, tokens, '%30.30f{2}', ` ${path.join('tap', 'layouts-test.js')}` ); testPattern(assert, layout, event, tokens, '%10.-5f{1}', ' st.js'); assert.end(); }); t.test('%l should output line number', (assert) => { testPattern(assert, layout, event, tokens, '%l', lineNumber.toString()); assert.end(); }); t.test('%l should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%5.10l', ' 1'); testPattern(assert, layout, event, tokens, '%.5l', '1'); testPattern(assert, layout, event, tokens, '%.-5l', '1'); testPattern(assert, layout, event, tokens, '%-5l', '1 '); assert.end(); }); t.test('%o should output column postion', (assert) => { testPattern(assert, layout, event, tokens, '%o', columnNumber.toString()); assert.end(); }); t.test('%o should accept truncation and padding', (assert) => { testPattern(assert, layout, event, tokens, '%5.10o', ' 14'); testPattern(assert, layout, event, tokens, '%.5o', '14'); testPattern(assert, layout, event, tokens, '%.1o', '1'); testPattern(assert, layout, event, tokens, '%.-1o', '4'); testPattern(assert, layout, event, tokens, '%-5o', '14 '); assert.end(); }); t.test('%s should output stack', (assert) => { testPattern(assert, layout, event, tokens, '%s', callStack); assert.end(); }); t.test( '%f should output empty string when fileName not exist', (assert) => { delete event.fileName; testPattern(assert, layout, event, tokens, '%f', ''); assert.end(); } ); t.test( '%l should output empty string when lineNumber not exist', (assert) => { delete event.lineNumber; testPattern(assert, layout, event, tokens, '%l', ''); assert.end(); } ); t.test( '%o should output empty string when columnNumber not exist', (assert) => { delete event.columnNumber; testPattern(assert, layout, event, tokens, '%o', ''); assert.end(); } ); t.test( '%s should output empty string when callStack not exist', (assert) => { delete event.callStack; testPattern(assert, layout, event, tokens, '%s', ''); assert.end(); } ); t.test('should output anything not preceded 
by % as literal', (assert) => { testPattern( assert, layout, event, tokens, 'blah blah blah', 'blah blah blah' ); assert.end(); }); t.test( 'should output the original string if no replacer matches the token', (assert) => { testPattern(assert, layout, event, tokens, '%a{3}', 'a{3}'); assert.end(); } ); t.test('should handle complicated patterns', (assert) => { testPattern( assert, layout, event, tokens, '%m%n %c{2} at %d{ABSOLUTE} cheese %p%n', // deprecated `this is a test${EOL} of.tests at 14:18:30.045 cheese DEBUG${EOL}` ); testPattern( assert, layout, event, tokens, '%m%n %c{2} at %d{ABSOLUTETIME} cheese %p%n', `this is a test${EOL} of.tests at 14:18:30.045 cheese DEBUG${EOL}` ); assert.end(); }); t.test('should truncate fields if specified', (assert) => { testPattern(assert, layout, event, tokens, '%.4m', 'this'); testPattern(assert, layout, event, tokens, '%.7m', 'this is'); testPattern(assert, layout, event, tokens, '%.9m', 'this is a'); testPattern(assert, layout, event, tokens, '%.14m', 'this is a test'); testPattern( assert, layout, event, tokens, '%.2919102m', 'this is a test' ); testPattern(assert, layout, event, tokens, '%.-4m', 'test'); assert.end(); }); t.test('should pad fields if specified', (assert) => { testPattern(assert, layout, event, tokens, '%10p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%8p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%6p', ' DEBUG'); testPattern(assert, layout, event, tokens, '%4p', 'DEBUG'); testPattern(assert, layout, event, tokens, '%-4p', 'DEBUG'); testPattern(assert, layout, event, tokens, '%-6p', 'DEBUG '); testPattern(assert, layout, event, tokens, '%-8p', 'DEBUG '); testPattern(assert, layout, event, tokens, '%-10p', 'DEBUG '); assert.end(); }); t.test('%[%r%] should output colored time', (assert) => { testPattern( assert, layout, event, tokens, '%[%r%]', '\x1B[36m14:18:30\x1B[39m' ); assert.end(); }); t.test( '%x{testString} should output the string stored in tokens', (assert) => { testPattern( assert, layout, event, tokens, '%x{testString}', 'testStringToken' ); assert.end(); } ); t.test( '%x{testFunction} should output the result of the function stored in tokens', (assert) => { testPattern( assert, layout, event, tokens, '%x{testFunction}', 'testFunctionToken' ); assert.end(); } ); t.test( '%x{doesNotExist} should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, tokens, '%x{doesNotExist}', 'null'); assert.end(); } ); t.test( '%x{fnThatUsesLogEvent} should be able to use the logEvent', (assert) => { testPattern( assert, layout, event, tokens, '%x{fnThatUsesLogEvent}', 'DEBUG' ); assert.end(); } ); t.test('%x should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, tokens, '%x', 'null'); assert.end(); }); t.test( '%X{testString} should output the string stored in tokens', (assert) => { testPattern( assert, layout, event, {}, '%X{testString}', 'testStringToken' ); assert.end(); } ); t.test( '%X{testFunction} should output the result of the function stored in tokens', (assert) => { testPattern( assert, layout, event, {}, '%X{testFunction}', 'testFunctionToken' ); assert.end(); } ); t.test( '%X{doesNotExist} should output the string stored in tokens', (assert) => { testPattern(assert, layout, event, {}, '%X{doesNotExist}', 'null'); assert.end(); } ); t.test( '%X{fnThatUsesLogEvent} should be able to use the logEvent', (assert) => { testPattern( assert, layout, event, {}, '%X{fnThatUsesLogEvent}', 'DEBUG' ); assert.end(); } ); t.test('%X should 
output the string stored in tokens', (assert) => { testPattern(assert, layout, event, {}, '%X', 'null'); assert.end(); }); t.test('%M should output function name', (assert) => { testPattern(assert, layout, event, tokens, '%M', functionName); assert.end(); }); t.test( '%M should output empty string when functionName not exist', (assert) => { delete event.functionName; testPattern(assert, layout, event, tokens, '%M', ''); assert.end(); } ); t.test('%C should output class name', (assert) => { testPattern(assert, layout, event, tokens, '%C', className); assert.end(); }); t.test( '%C should output empty string when className not exist', (assert) => { delete event.className; testPattern(assert, layout, event, tokens, '%C', ''); assert.end(); } ); t.test('%A should output function alias', (assert) => { testPattern(assert, layout, event, tokens, '%A', functionAlias); assert.end(); }); t.test( '%A should output empty string when functionAlias not exist', (assert) => { delete event.functionAlias; testPattern(assert, layout, event, tokens, '%A', ''); assert.end(); } ); t.test('%F should output fully qualified caller name', (assert) => { testPattern(assert, layout, event, tokens, '%F', callerName); assert.end(); }); t.test( '%F should output empty string when callerName not exist', (assert) => { delete event.callerName; testPattern(assert, layout, event, tokens, '%F', ''); assert.end(); } ); t.end(); }); batch.test('layout makers', (t) => { const layouts = require('../../lib/layouts'); t.test('should have a maker for each layout', (assert) => { assert.ok(layouts.layout('messagePassThrough')); assert.ok(layouts.layout('basic')); assert.ok(layouts.layout('colored')); assert.ok(layouts.layout('coloured')); assert.ok(layouts.layout('pattern')); assert.ok(layouts.layout('dummy')); assert.end(); }); t.test( 'layout pattern maker should pass pattern and tokens to layout from config', (assert) => { let layout = layouts.layout('pattern', { pattern: '%%' }); assert.equal(layout({}), '%'); layout = layouts.layout('pattern', { pattern: '%x{testStringToken}', tokens: { testStringToken: 'cheese' }, }); assert.equal(layout({}), 'cheese'); assert.end(); } ); t.end(); }); batch.test('add layout', (t) => { const layouts = require('../../lib/layouts'); t.test('should be able to add a layout', (assert) => { layouts.addLayout('test_layout', (config) => { assert.equal(config, 'test_config'); return function (logEvent) { return `TEST LAYOUT >${logEvent.data}`; }; }); const serializer = layouts.layout('test_layout', 'test_config'); assert.ok(serializer); assert.equal(serializer({ data: 'INPUT' }), 'TEST LAYOUT >INPUT'); assert.end(); }); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
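The sentinel strings in the table (`__LOG4JS_NaN__`, `__LOG4JS_Infinity__`, `__LOG4JS_-Infinity__`, `__LOG4JS_undefined__`) come straight from the description above. The sketch below is illustrative only, not the library's actual `serialise()`/`deserialise()` code; the helper names `wrap`, `unwrap`, `unwrapDeep`, `serialise`, and `deserialise` are assumptions made for the example.

```js
// Minimal sketch (illustrative, not log4js's actual implementation) of
// round-tripping values that plain JSON cannot represent, using the
// sentinel strings from the table above.
const wrap = (value) => {
  if (Number.isNaN(value)) return '__LOG4JS_NaN__';
  if (value === Infinity) return '__LOG4JS_Infinity__';
  if (value === -Infinity) return '__LOG4JS_-Infinity__';
  if (value === undefined) return '__LOG4JS_undefined__';
  return value;
};

const unwrap = (value) => {
  if (value === '__LOG4JS_NaN__') return NaN;
  if (value === '__LOG4JS_Infinity__') return Infinity;
  if (value === '__LOG4JS_-Infinity__') return -Infinity;
  if (value === '__LOG4JS_undefined__') return undefined;
  return value;
};

// JSON.stringify's replacer sees every value, so wrap() can swap in sentinels.
const serialise = (event) => JSON.stringify(event, (key, value) => wrap(value));

// Restore sentinels with a post-parse walk rather than a JSON.parse reviver:
// a reviver that returns undefined would delete the property instead of
// keeping an explicit undefined value.
const unwrapDeep = (value) => {
  if (Array.isArray(value)) return value.map((item) => unwrapDeep(item));
  if (value !== null && typeof value === 'object') {
    Object.keys(value).forEach((key) => {
      value[key] = unwrapDeep(value[key]);
    });
    return value;
  }
  return unwrap(value);
};
const deserialise = (text) => unwrapDeep(JSON.parse(text));

// Round trip matching the table: the output equals the input.
const input = { a: Number('abc'), b: 1 / 0, c: -1 / 0, d: [undefined] };
const output = deserialise(serialise(input));
// output => { a: NaN, b: Infinity, c: -Infinity, d: [ undefined ] }
```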
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/logger-test.js
const { test } = require('tap'); const debug = require('debug')('log4js:test.logger'); const sandbox = require('@log4js-node/sandboxed-module'); const callsites = require('callsites'); const levels = require('../../lib/levels'); const categories = require('../../lib/categories'); /** @type {import('../../types/log4js').LoggingEvent[]} */ const events = []; /** @type {string[]} */ const messages = []; /** * @typedef {import('../../types/log4js').Logger} LoggerClass */ /** @type {{new (): LoggerClass}} */ const Logger = sandbox.require('../../lib/logger', { requires: { './levels': levels, './categories': categories, './clustering': { isMaster: () => true, onlyOnMaster: (fn) => fn(), send: (evt) => { debug('fake clustering got event:', evt); events.push(evt); }, }, }, globals: { console: { ...console, error(msg) { messages.push(msg); }, }, }, }); const testConfig = { level: levels.TRACE, }; test('../../lib/logger', (batch) => { batch.beforeEach((done) => { events.length = 0; testConfig.level = levels.TRACE; if (typeof done === 'function') { done(); } }); batch.test('constructor with no parameters', (t) => { t.throws(() => new Logger(), new Error('No category provided.')); t.end(); }); batch.test('constructor with category', (t) => { const logger = new Logger('cheese'); t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.OFF, 'should use OFF log level'); t.end(); }); batch.test('set level should delegate', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.DEBUG, 'should use level'); t.end(); }); batch.test('isLevelEnabled', (t) => { const logger = new Logger('cheese'); const functions = [ 'isTraceEnabled', 'isDebugEnabled', 'isInfoEnabled', 'isWarnEnabled', 'isErrorEnabled', 'isFatalEnabled', ]; t.test( 'should provide a level enabled function for all levels', (subtest) => { subtest.plan(functions.length); functions.forEach((fn) => { subtest.type(logger[fn], 'function'); }); } ); logger.level = 'INFO'; t.notOk(logger.isTraceEnabled()); t.notOk(logger.isDebugEnabled()); t.ok(logger.isInfoEnabled()); t.ok(logger.isWarnEnabled()); t.ok(logger.isErrorEnabled()); t.ok(logger.isFatalEnabled()); t.end(); }); batch.test('should send log events to dispatch function', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; logger.debug('Event 1'); logger.debug('Event 2'); logger.debug('Event 3'); t.equal(events.length, 3); t.equal(events[0].data[0], 'Event 1'); t.equal(events[1].data[0], 'Event 2'); t.equal(events[2].data[0], 'Event 3'); t.end(); }); batch.test('should add context values to every event', (t) => { const logger = new Logger('fromage'); logger.level = 'debug'; logger.debug('Event 1'); logger.addContext('cheese', 'edam'); logger.debug('Event 2'); logger.debug('Event 3'); logger.addContext('biscuits', 'timtam'); logger.debug('Event 4'); logger.removeContext('cheese'); logger.debug('Event 5'); logger.clearContext(); logger.debug('Event 6'); t.equal(events.length, 6); t.same(events[0].context, {}); t.same(events[1].context, { cheese: 'edam' }); t.same(events[2].context, { cheese: 'edam' }); t.same(events[3].context, { cheese: 'edam', biscuits: 'timtam' }); t.same(events[4].context, { biscuits: 'timtam' }); t.same(events[5].context, {}); t.end(); }); batch.test('should not break when log data has no toString', (t) => { const logger = new Logger('thing'); logger.level = 'debug'; logger.info('Just testing ', Object.create(null)); 
t.equal(events.length, 1); t.end(); }); batch.test( 'default should disable useCallStack unless manual enable', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; t.equal(logger.useCallStack, false); logger.debug('test no callStack'); let event = events.shift(); t.notMatch(event, { functionName: String }); t.notMatch(event, { fileName: String }); t.notMatch(event, { lineNumber: Number }); t.notMatch(event, { columnNumber: Number }); t.notMatch(event, { callStack: String }); logger.useCallStack = false; t.equal(logger.useCallStack, false); logger.useCallStack = 0; t.equal(logger.useCallStack, false); logger.useCallStack = ''; t.equal(logger.useCallStack, false); logger.useCallStack = null; t.equal(logger.useCallStack, false); logger.useCallStack = undefined; t.equal(logger.useCallStack, false); logger.useCallStack = 'true'; t.equal(logger.useCallStack, false); logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.debug('test with callStack'); event = events.shift(); t.match(event, { functionName: String, fileName: String, lineNumber: Number, columnNumber: Number, callStack: String, }); t.end(); } ); batch.test('should correctly switch on/off useCallStack', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.info('hello world'); const callsite = callsites()[0]; t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 12); logger.useCallStack = false; logger.info('disabled'); t.equal(logger.useCallStack, false); t.equal(events[1].data[0], 'disabled'); t.equal(events[1].fileName, undefined); t.equal(events[1].lineNumber, undefined); t.equal(events[1].columnNumber, undefined); t.end(); }); batch.test( 'Once switch on/off useCallStack will apply all same category loggers', (t) => { const logger1 = new Logger('stack'); logger1.level = 'debug'; logger1.useCallStack = true; const logger2 = new Logger('stack'); logger2.level = 'debug'; logger1.info('hello world'); const callsite = callsites()[0]; t.equal(logger1.useCallStack, true); t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 15); // col of the '.' in logger1.info(...) logger2.info('hello world'); const callsite2 = callsites()[0]; t.equal(logger2.useCallStack, true); t.equal(events[1].data[0], 'hello world'); t.equal(events[1].fileName, callsite2.getFileName()); t.equal(events[1].lineNumber, callsite2.getLineNumber() - 1); t.equal(events[1].columnNumber, 15); // col of the '.' in logger1.info(...) 
logger1.useCallStack = false; logger2.info('hello world'); t.equal(logger2.useCallStack, false); t.equal(events[2].data[0], 'hello world'); t.equal(events[2].fileName, undefined); t.equal(events[2].lineNumber, undefined); t.equal(events[2].columnNumber, undefined); t.end(); } ); batch.test('parseCallStack function coverage', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; results = logger.parseCallStack(new Error()); t.ok(results); t.equal(messages.length, 0, 'should not have error'); results = logger.parseCallStack(''); t.notOk(results); t.equal(messages.length, 1, 'should have error'); results = logger.parseCallStack(new Error(), 100); t.equal(results, null); t.end(); }); batch.test('parseCallStack names extraction', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; const callStack1 = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack1 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'Foo.bar [as baz]'); const callStack2 = ' at bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack2 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'bar [as baz]'); const callStack3 = ' at bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack3 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'bar'); const callStack4 = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack4 }, 0); t.ok(results); 
t.equal(results.className, ''); t.equal(results.functionName, ''); t.equal(results.functionAlias, ''); t.equal(results.callerName, ''); const callStack5 = ' at Foo.bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack5 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'Foo.bar'); t.end(); }); batch.test('should correctly change the parseCallStack function', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; logger.info('test defaultParseCallStack'); const initialEvent = events.shift(); const parseFunction = function () { return { functionName: 'test function name', fileName: 'test file name', lineNumber: 15, columnNumber: 25, callStack: 'test callstack', }; }; logger.setParseCallStackFunction(parseFunction); t.equal(logger.parseCallStack, parseFunction); logger.info('test parseCallStack'); t.equal(events[0].functionName, 'test function name'); t.equal(events[0].fileName, 'test file name'); t.equal(events[0].lineNumber, 15); t.equal(events[0].columnNumber, 25); t.equal(events[0].callStack, 'test callstack'); events.shift(); logger.setParseCallStackFunction(undefined); logger.info('test restoredDefaultParseCallStack'); t.equal(events[0].functionName, initialEvent.functionName); t.equal(events[0].fileName, initialEvent.fileName); t.equal(events[0].columnNumber, initialEvent.columnNumber); t.throws(() => logger.setParseCallStackFunction('not a function')); t.end(); }); batch.test('should correctly change the stack levels to skip', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal( logger.callStackLinesToSkip, 0, 'initial callStackLinesToSkip changed' ); logger.info('get initial stack'); const initialEvent = events.shift(); const newStackSkip = 1; logger.callStackLinesToSkip = newStackSkip; t.equal(logger.callStackLinesToSkip, newStackSkip); logger.info('test stack skip'); const event = events.shift(); t.not(event.functionName, initialEvent.functionName); t.not(event.fileName, initialEvent.fileName); t.equal( event.callStack, initialEvent.callStack.split('\n').slice(newStackSkip).join('\n') ); t.throws(() => { logger.callStackLinesToSkip = -1; }); t.throws(() => { logger.callStackLinesToSkip = '2'; }); t.end(); }); batch.test('should utilize the first Error data value', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; const error = new Error(); logger.info(error); const event = events.shift(); t.equal(event.error, error); logger.info(error); t.match(event, events.shift()); logger.callStackLinesToSkip = 1; logger.info(error); const event2 = events.shift(); t.equal(event2.callStack, event.callStack.split('\n').slice(1).join('\n')); logger.callStackLinesToSkip = 0; logger.info('hi', error); const event3 = events.shift(); t.equal(event3.callStack, event.callStack); t.equal(event3.error, error); logger.info('hi', error, new Error()); const event4 = events.shift(); t.equal(event4.callStack, 
event.callStack); t.equal(event4.error, error); t.end(); }); batch.test('creating/cloning of category', (t) => { const defaultLogger = new Logger('default'); defaultLogger.level = 'trace'; defaultLogger.useCallStack = true; t.test( 'category should be cloned from parent/default if does not exist', (assert) => { const originalLength = categories.size; const logger = new Logger('cheese1'); assert.equal( categories.size, originalLength + 1, 'category should be cloned' ); assert.equal( logger.level, levels.TRACE, 'should inherit level=TRACE from default-category' ); assert.equal( logger.useCallStack, true, 'should inherit useCallStack=true from default-category' ); assert.end(); } ); t.test( 'changing level should not impact default-category or useCallStack', (assert) => { const logger = new Logger('cheese2'); logger.level = 'debug'; assert.equal( logger.level, levels.DEBUG, 'should be changed to level=DEBUG' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.equal( logger.useCallStack, true, 'should remain as useCallStack=true' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.end(); } ); t.test( 'changing useCallStack should not impact default-category or level', (assert) => { const logger = new Logger('cheese3'); logger.useCallStack = false; assert.equal( logger.useCallStack, false, 'should be changed to useCallStack=false' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.equal( logger.level, levels.TRACE, 'should remain as level=TRACE' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.end(); } ); t.end(); }); batch.end(); });
const { test } = require('tap'); const debug = require('debug')('log4js:test.logger'); const sandbox = require('@log4js-node/sandboxed-module'); const callsites = require('callsites'); const levels = require('../../lib/levels'); const categories = require('../../lib/categories'); /** @type {import('../../types/log4js').LoggingEvent[]} */ const events = []; /** @type {string[]} */ const messages = []; /** * @typedef {import('../../types/log4js').Logger} LoggerClass */ /** @type {{new (): LoggerClass}} */ const Logger = sandbox.require('../../lib/logger', { requires: { './levels': levels, './categories': categories, './clustering': { isMaster: () => true, onlyOnMaster: (fn) => fn(), send: (evt) => { debug('fake clustering got event:', evt); events.push(evt); }, }, }, globals: { console: { ...console, error(msg) { messages.push(msg); }, }, }, }); const testConfig = { level: levels.TRACE, }; test('../../lib/logger', (batch) => { batch.beforeEach((done) => { events.length = 0; testConfig.level = levels.TRACE; if (typeof done === 'function') { done(); } }); batch.test('constructor with no parameters', (t) => { t.throws(() => new Logger(), new Error('No category provided.')); t.end(); }); batch.test('constructor with category', (t) => { const logger = new Logger('cheese'); t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.OFF, 'should use OFF log level'); t.end(); }); batch.test('set level should delegate', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; t.equal(logger.category, 'cheese', 'should use category'); t.equal(logger.level, levels.DEBUG, 'should use level'); t.end(); }); batch.test('isLevelEnabled', (t) => { const logger = new Logger('cheese'); const functions = [ 'isTraceEnabled', 'isDebugEnabled', 'isInfoEnabled', 'isWarnEnabled', 'isErrorEnabled', 'isFatalEnabled', ]; t.test( 'should provide a level enabled function for all levels', (subtest) => { subtest.plan(functions.length); functions.forEach((fn) => { subtest.type(logger[fn], 'function'); }); } ); logger.level = 'INFO'; t.notOk(logger.isTraceEnabled()); t.notOk(logger.isDebugEnabled()); t.ok(logger.isInfoEnabled()); t.ok(logger.isWarnEnabled()); t.ok(logger.isErrorEnabled()); t.ok(logger.isFatalEnabled()); t.end(); }); batch.test('should send log events to dispatch function', (t) => { const logger = new Logger('cheese'); logger.level = 'debug'; logger.debug('Event 1'); logger.debug('Event 2'); logger.debug('Event 3'); t.equal(events.length, 3); t.equal(events[0].data[0], 'Event 1'); t.equal(events[1].data[0], 'Event 2'); t.equal(events[2].data[0], 'Event 3'); t.end(); }); batch.test('should add context values to every event', (t) => { const logger = new Logger('fromage'); logger.level = 'debug'; logger.debug('Event 1'); logger.addContext('cheese', 'edam'); logger.debug('Event 2'); logger.debug('Event 3'); logger.addContext('biscuits', 'timtam'); logger.debug('Event 4'); logger.removeContext('cheese'); logger.debug('Event 5'); logger.clearContext(); logger.debug('Event 6'); t.equal(events.length, 6); t.same(events[0].context, {}); t.same(events[1].context, { cheese: 'edam' }); t.same(events[2].context, { cheese: 'edam' }); t.same(events[3].context, { cheese: 'edam', biscuits: 'timtam' }); t.same(events[4].context, { biscuits: 'timtam' }); t.same(events[5].context, {}); t.end(); }); batch.test('should not break when log data has no toString', (t) => { const logger = new Logger('thing'); logger.level = 'debug'; logger.info('Just testing ', Object.create(null)); 
t.equal(events.length, 1); t.end(); }); batch.test( 'default should disable useCallStack unless manual enable', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; t.equal(logger.useCallStack, false); logger.debug('test no callStack'); let event = events.shift(); t.notMatch(event, { functionName: String }); t.notMatch(event, { fileName: String }); t.notMatch(event, { lineNumber: Number }); t.notMatch(event, { columnNumber: Number }); t.notMatch(event, { callStack: String }); logger.useCallStack = false; t.equal(logger.useCallStack, false); logger.useCallStack = 0; t.equal(logger.useCallStack, false); logger.useCallStack = ''; t.equal(logger.useCallStack, false); logger.useCallStack = null; t.equal(logger.useCallStack, false); logger.useCallStack = undefined; t.equal(logger.useCallStack, false); logger.useCallStack = 'true'; t.equal(logger.useCallStack, false); logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.debug('test with callStack'); event = events.shift(); t.match(event, { functionName: String, fileName: String, lineNumber: Number, columnNumber: Number, callStack: String, }); t.end(); } ); batch.test('should correctly switch on/off useCallStack', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal(logger.useCallStack, true); logger.info('hello world'); const callsite = callsites()[0]; t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 12); logger.useCallStack = false; logger.info('disabled'); t.equal(logger.useCallStack, false); t.equal(events[1].data[0], 'disabled'); t.equal(events[1].fileName, undefined); t.equal(events[1].lineNumber, undefined); t.equal(events[1].columnNumber, undefined); t.end(); }); batch.test( 'Once switch on/off useCallStack will apply all same category loggers', (t) => { const logger1 = new Logger('stack'); logger1.level = 'debug'; logger1.useCallStack = true; const logger2 = new Logger('stack'); logger2.level = 'debug'; logger1.info('hello world'); const callsite = callsites()[0]; t.equal(logger1.useCallStack, true); t.equal(events.length, 1); t.equal(events[0].data[0], 'hello world'); t.equal(events[0].fileName, callsite.getFileName()); t.equal(events[0].lineNumber, callsite.getLineNumber() - 1); t.equal(events[0].columnNumber, 15); // col of the '.' in logger1.info(...) logger2.info('hello world'); const callsite2 = callsites()[0]; t.equal(logger2.useCallStack, true); t.equal(events[1].data[0], 'hello world'); t.equal(events[1].fileName, callsite2.getFileName()); t.equal(events[1].lineNumber, callsite2.getLineNumber() - 1); t.equal(events[1].columnNumber, 15); // col of the '.' in logger1.info(...) 
logger1.useCallStack = false; logger2.info('hello world'); t.equal(logger2.useCallStack, false); t.equal(events[2].data[0], 'hello world'); t.equal(events[2].fileName, undefined); t.equal(events[2].lineNumber, undefined); t.equal(events[2].columnNumber, undefined); t.end(); } ); batch.test('parseCallStack function coverage', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; results = logger.parseCallStack(new Error()); t.ok(results); t.equal(messages.length, 0, 'should not have error'); results = logger.parseCallStack(''); t.notOk(results); t.equal(messages.length, 1, 'should have error'); results = logger.parseCallStack(new Error(), 100); t.equal(results, null); t.end(); }); batch.test('parseCallStack names extraction', (t) => { const logger = new Logger('stack'); logger.useCallStack = true; let results; const callStack1 = ' at Foo.bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack1 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'Foo.bar [as baz]'); const callStack2 = ' at bar [as baz] (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack2 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, 'baz'); t.equal(results.callerName, 'bar [as baz]'); const callStack3 = ' at bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack3 }, 0); t.ok(results); t.equal(results.className, ''); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'bar'); const callStack4 = ' at repl:1:14\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack4 }, 0); t.ok(results); 
t.equal(results.className, ''); t.equal(results.functionName, ''); t.equal(results.functionAlias, ''); t.equal(results.callerName, ''); const callStack5 = ' at Foo.bar (repl:1:14)\n at ContextifyScript.Script.runInThisContext (vm.js:50:33)\n at REPLServer.defaultEval (repl.js:240:29)\n at bound (domain.js:301:14)\n at REPLServer.runBound [as eval] (domain.js:314:12)\n at REPLServer.onLine (repl.js:468:10)\n at emitOne (events.js:121:20)\n at REPLServer.emit (events.js:211:7)\n at REPLServer.Interface._onLine (readline.js:280:10)\n at REPLServer.Interface._line (readline.js:629:8)'; // eslint-disable-line max-len results = logger.parseCallStack({ stack: callStack5 }, 0); t.ok(results); t.equal(results.className, 'Foo'); t.equal(results.functionName, 'bar'); t.equal(results.functionAlias, ''); t.equal(results.callerName, 'Foo.bar'); t.end(); }); batch.test('should correctly change the parseCallStack function', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; logger.info('test defaultParseCallStack'); const initialEvent = events.shift(); const parseFunction = function () { return { functionName: 'test function name', fileName: 'test file name', lineNumber: 15, columnNumber: 25, callStack: 'test callstack', }; }; logger.setParseCallStackFunction(parseFunction); t.equal(logger.parseCallStack, parseFunction); logger.info('test parseCallStack'); t.equal(events[0].functionName, 'test function name'); t.equal(events[0].fileName, 'test file name'); t.equal(events[0].lineNumber, 15); t.equal(events[0].columnNumber, 25); t.equal(events[0].callStack, 'test callstack'); events.shift(); logger.setParseCallStackFunction(undefined); logger.info('test restoredDefaultParseCallStack'); t.equal(events[0].functionName, initialEvent.functionName); t.equal(events[0].fileName, initialEvent.fileName); t.equal(events[0].columnNumber, initialEvent.columnNumber); t.throws(() => logger.setParseCallStackFunction('not a function')); t.end(); }); batch.test('should correctly change the stack levels to skip', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; t.equal( logger.callStackLinesToSkip, 0, 'initial callStackLinesToSkip changed' ); logger.info('get initial stack'); const initialEvent = events.shift(); const newStackSkip = 1; logger.callStackLinesToSkip = newStackSkip; t.equal(logger.callStackLinesToSkip, newStackSkip); logger.info('test stack skip'); const event = events.shift(); t.not(event.functionName, initialEvent.functionName); t.not(event.fileName, initialEvent.fileName); t.equal( event.callStack, initialEvent.callStack.split('\n').slice(newStackSkip).join('\n') ); t.throws(() => { logger.callStackLinesToSkip = -1; }); t.throws(() => { logger.callStackLinesToSkip = '2'; }); t.end(); }); batch.test('should utilize the first Error data value', (t) => { const logger = new Logger('stack'); logger.level = 'debug'; logger.useCallStack = true; const error = new Error(); logger.info(error); const event = events.shift(); t.equal(event.error, error); logger.info(error); t.match(event, events.shift()); logger.callStackLinesToSkip = 1; logger.info(error); const event2 = events.shift(); t.equal(event2.callStack, event.callStack.split('\n').slice(1).join('\n')); logger.callStackLinesToSkip = 0; logger.info('hi', error); const event3 = events.shift(); t.equal(event3.callStack, event.callStack); t.equal(event3.error, error); logger.info('hi', error, new Error()); const event4 = events.shift(); t.equal(event4.callStack, 
event.callStack); t.equal(event4.error, error); t.end(); }); batch.test('creating/cloning of category', (t) => { const defaultLogger = new Logger('default'); defaultLogger.level = 'trace'; defaultLogger.useCallStack = true; t.test( 'category should be cloned from parent/default if does not exist', (assert) => { const originalLength = categories.size; const logger = new Logger('cheese1'); assert.equal( categories.size, originalLength + 1, 'category should be cloned' ); assert.equal( logger.level, levels.TRACE, 'should inherit level=TRACE from default-category' ); assert.equal( logger.useCallStack, true, 'should inherit useCallStack=true from default-category' ); assert.end(); } ); t.test( 'changing level should not impact default-category or useCallStack', (assert) => { const logger = new Logger('cheese2'); logger.level = 'debug'; assert.equal( logger.level, levels.DEBUG, 'should be changed to level=DEBUG' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.equal( logger.useCallStack, true, 'should remain as useCallStack=true' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.end(); } ); t.test( 'changing useCallStack should not impact default-category or level', (assert) => { const logger = new Logger('cheese3'); logger.useCallStack = false; assert.equal( logger.useCallStack, false, 'should be changed to useCallStack=false' ); assert.equal( defaultLogger.useCallStack, true, 'default-category should remain as useCallStack=true' ); assert.equal( logger.level, levels.TRACE, 'should remain as level=TRACE' ); assert.equal( defaultLogger.level, levels.TRACE, 'default-category should remain as level=TRACE' ); assert.end(); } ); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187
Supersedes PR #1188

## Affected Components

Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`.

| Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |
|-|-|-|-|-|
| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ |
| `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ |
| `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ |
| `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ |

Compared to PR #1188, now the output matches exactly the input.
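For context on why sentinel strings are needed at all, the snippet below (plain Node.js, nothing log4js-specific) shows how standard JSON serialisation silently loses exactly the values listed in the table above; this is the behaviour the encoding works around.

```js
// Plain JSON cannot represent NaN, Infinity, -Infinity, or undefined:
JSON.stringify({ a: Number('abc'), b: 1 / 0, c: -1 / 0 });
// => '{"a":null,"b":null,"c":null}'   (all three collapse to null)

JSON.stringify([undefined]);
// => '[null]'                         (undefined in arrays becomes null)

JSON.stringify({ d: undefined });
// => '{}'                             (undefined properties are dropped)

// After JSON.parse on the receiving end, the original values are gone,
// hence the sentinel-string encoding described in the table above.
```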
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/connect-context-test.js
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const EE = require('events').EventEmitter; const levels = require('../../lib/levels'); class MockLogger { constructor() { this.level = levels.TRACE; this.context = {}; this.contexts = []; } log() { this.contexts.push(Object.assign({}, this.context)); // eslint-disable-line prefer-object-spread } isLevelEnabled(level) { return level.isGreaterThanOrEqualTo(this.level); } addContext(key, value) { this.context[key] = value; } removeContext(key) { delete this.context[key]; } } function MockRequest(remoteAddr, method, originalUrl) { this.socket = { remoteAddress: remoteAddr }; this.originalUrl = originalUrl; this.method = method; this.httpVersionMajor = '5'; this.httpVersionMinor = '0'; this.headers = {}; } class MockResponse extends EE { constructor(code) { super(); this.statusCode = code; this.cachedHeaders = {}; } end() { this.emit('finish'); } setHeader(key, value) { this.cachedHeaders[key.toLowerCase()] = value; } getHeader(key) { return this.cachedHeaders[key.toLowerCase()]; } writeHead(code /* , headers */) { this.statusCode = code; } } test('log4js connect logger', (batch) => { const clm = require('../../lib/connect-logger'); batch.test('with context config', (t) => { const ml = new MockLogger(); const cl = clm(ml, { context: true }); t.beforeEach((done) => { ml.contexts = []; if (typeof done === 'function') { done(); } }); t.test('response should be included in context', (assert) => { const { contexts } = ml; const req = new MockRequest( 'my.remote.addr', 'GET', 'http://url/hoge.png' ); // not gif const res = new MockResponse(200); cl(req, res, () => {}); res.end('chunk', 'encoding'); assert.type(contexts, 'Array'); assert.equal(contexts.length, 1); assert.type(contexts[0].res, MockResponse); assert.end(); }); t.end(); }); batch.test('without context config', (t) => { const ml = new MockLogger(); const cl = clm(ml, {}); t.beforeEach((done) => { ml.contexts = []; if (typeof done === 'function') { done(); } }); t.test('response should not be included in context', (assert) => { const { contexts } = ml; const req = new MockRequest( 'my.remote.addr', 'GET', 'http://url/hoge.png' ); // not gif const res = new MockResponse(200); cl(req, res, () => {}); res.end('chunk', 'encoding'); assert.type(contexts, 'Array'); assert.equal(contexts.length, 1); assert.type(contexts[0].res, undefined); assert.end(); }); t.end(); }); batch.end(); });
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const EE = require('events').EventEmitter; const levels = require('../../lib/levels'); class MockLogger { constructor() { this.level = levels.TRACE; this.context = {}; this.contexts = []; } log() { this.contexts.push(Object.assign({}, this.context)); // eslint-disable-line prefer-object-spread } isLevelEnabled(level) { return level.isGreaterThanOrEqualTo(this.level); } addContext(key, value) { this.context[key] = value; } removeContext(key) { delete this.context[key]; } } function MockRequest(remoteAddr, method, originalUrl) { this.socket = { remoteAddress: remoteAddr }; this.originalUrl = originalUrl; this.method = method; this.httpVersionMajor = '5'; this.httpVersionMinor = '0'; this.headers = {}; } class MockResponse extends EE { constructor(code) { super(); this.statusCode = code; this.cachedHeaders = {}; } end() { this.emit('finish'); } setHeader(key, value) { this.cachedHeaders[key.toLowerCase()] = value; } getHeader(key) { return this.cachedHeaders[key.toLowerCase()]; } writeHead(code /* , headers */) { this.statusCode = code; } } test('log4js connect logger', (batch) => { const clm = require('../../lib/connect-logger'); batch.test('with context config', (t) => { const ml = new MockLogger(); const cl = clm(ml, { context: true }); t.beforeEach((done) => { ml.contexts = []; if (typeof done === 'function') { done(); } }); t.test('response should be included in context', (assert) => { const { contexts } = ml; const req = new MockRequest( 'my.remote.addr', 'GET', 'http://url/hoge.png' ); // not gif const res = new MockResponse(200); cl(req, res, () => {}); res.end('chunk', 'encoding'); assert.type(contexts, 'Array'); assert.equal(contexts.length, 1); assert.type(contexts[0].res, MockResponse); assert.end(); }); t.end(); }); batch.test('without context config', (t) => { const ml = new MockLogger(); const cl = clm(ml, {}); t.beforeEach((done) => { ml.contexts = []; if (typeof done === 'function') { done(); } }); t.test('response should not be included in context', (assert) => { const { contexts } = ml; const req = new MockRequest( 'my.remote.addr', 'GET', 'http://url/hoge.png' ); // not gif const res = new MockResponse(200); cl(req, res, () => {}); res.end('chunk', 'encoding'); assert.type(contexts, 'Array'); assert.equal(contexts.length, 1); assert.type(contexts[0].res, undefined); assert.end(); }); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
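The connect-context-test record above drives `connect-logger` with a mock logger; in an application the same `context: true` option is usually wired up as Connect/Express middleware. A minimal sketch, assuming `express` is installed (app shape and category name invented):

```javascript
const log4js = require('log4js');
const express = require('express'); // assumed dependency for this sketch

log4js.configure({
  appenders: { out: { type: 'stdout' } },
  categories: { default: { appenders: ['out'], level: 'info' } },
});

const httpLogger = log4js.getLogger('http');
const app = express();

// context: true copies the response object into the logger context,
// which is exactly what the tests above assert via contexts[0].res.
app.use(log4js.connectLogger(httpLogger, { level: 'auto', context: true }));

app.get('/', (req, res) => res.send('ok'));
app.listen(3000);
```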
./lib/appenders/dateFile.js
const streams = require('streamroller'); const os = require('os'); const eol = os.EOL; function openTheStream(filename, pattern, options) { const stream = new streams.DateRollingFileStream(filename, pattern, options); stream.on('error', (err) => { // eslint-disable-next-line no-console console.error( 'log4js.dateFileAppender - Writing to file %s, error happened ', filename, err ); }); stream.on('drain', () => { process.emit('log4js:pause', false); }); return stream; } /** * File appender that rolls files according to a date pattern. * @param filename base filename. * @param pattern the format that will be added to the end of filename when rolling, * also used to check when to roll files - defaults to '.yyyy-MM-dd' * @param layout layout function for log messages - defaults to basicLayout * @param options - options to be passed to the underlying stream * @param timezoneOffset - optional timezone offset in minutes (default system local) */ function appender(filename, pattern, layout, options, timezoneOffset) { // the options for file appender use maxLogSize, but the docs say any file appender // options should work for dateFile as well. options.maxSize = options.maxLogSize; const writer = openTheStream(filename, pattern, options); const app = function (logEvent) { if (!writer.writable) { return; } if (!writer.write(layout(logEvent, timezoneOffset) + eol, 'utf8')) { process.emit('log4js:pause', true); } }; app.shutdown = function (complete) { writer.end('', 'utf-8', complete); }; return app; } function configure(config, layouts) { let layout = layouts.basicLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } if (!config.alwaysIncludePattern) { config.alwaysIncludePattern = false; } // security default (instead of relying on streamroller default) config.mode = config.mode || 0o600; return appender( config.filename, config.pattern, layout, config, config.timezoneOffset ); } module.exports.configure = configure;
const streams = require('streamroller'); const os = require('os'); const eol = os.EOL; function openTheStream(filename, pattern, options) { const stream = new streams.DateRollingFileStream(filename, pattern, options); stream.on('error', (err) => { // eslint-disable-next-line no-console console.error( 'log4js.dateFileAppender - Writing to file %s, error happened ', filename, err ); }); stream.on('drain', () => { process.emit('log4js:pause', false); }); return stream; } /** * File appender that rolls files according to a date pattern. * @param filename base filename. * @param pattern the format that will be added to the end of filename when rolling, * also used to check when to roll files - defaults to '.yyyy-MM-dd' * @param layout layout function for log messages - defaults to basicLayout * @param options - options to be passed to the underlying stream * @param timezoneOffset - optional timezone offset in minutes (default system local) */ function appender(filename, pattern, layout, options, timezoneOffset) { // the options for file appender use maxLogSize, but the docs say any file appender // options should work for dateFile as well. options.maxSize = options.maxLogSize; const writer = openTheStream(filename, pattern, options); const app = function (logEvent) { if (!writer.writable) { return; } if (!writer.write(layout(logEvent, timezoneOffset) + eol, 'utf8')) { process.emit('log4js:pause', true); } }; app.shutdown = function (complete) { writer.end('', 'utf-8', complete); }; return app; } function configure(config, layouts) { let layout = layouts.basicLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } if (!config.alwaysIncludePattern) { config.alwaysIncludePattern = false; } // security default (instead of relying on streamroller default) config.mode = config.mode || 0o600; return appender( config.filename, config.pattern, layout, config, config.timezoneOffset ); } module.exports.configure = configure;
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
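For reference, the dateFile appender implemented above is normally driven purely through configuration. A minimal sketch using only options visible in that code (filename and values invented; `maxLogSize` is remapped to streamroller's `maxSize`, as the comment in the appender notes):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    daily: {
      type: 'dateFile',
      filename: 'logs/app.log',    // base filename
      pattern: 'yyyy-MM-dd',       // appended when rolling (default '.yyyy-MM-dd')
      alwaysIncludePattern: false, // only rolled files carry the date pattern
      maxLogSize: 10485760,        // remapped to streamroller's maxSize by the appender
    },
  },
  categories: { default: { appenders: ['daily'], level: 'info' } },
});

log4js.getLogger().info('goes to logs/app.log, rolled by date');
```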
./lib/appenders/stdout.js
function stdoutAppender(layout, timezoneOffset) { return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } function configure(config, layouts) { let layout = layouts.colouredLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } return stdoutAppender(layout, config.timezoneOffset); } exports.configure = configure;
function stdoutAppender(layout, timezoneOffset) { return (loggingEvent) => { process.stdout.write(`${layout(loggingEvent, timezoneOffset)}\n`); }; } function configure(config, layouts) { let layout = layouts.colouredLayout; if (config.layout) { layout = layouts.layout(config.layout.type, config.layout); } return stdoutAppender(layout, config.timezoneOffset); } exports.configure = configure;
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
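The stdout appender above has only two knobs, both forwarded from the config: an optional `layout` (falling back to the coloured layout) and an optional `timezoneOffset`. A small sketch (pattern string invented):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    out: {
      type: 'stdout',
      // replaces the coloured default that configure() falls back to above
      layout: { type: 'pattern', pattern: '%d %p %c %m' },
      // optional offset in minutes (default: system local time), passed through to the layout
      timezoneOffset: 0,
    },
  },
  categories: { default: { appenders: ['out'], level: 'debug' } },
});

log4js.getLogger('startup').info('hello from stdout');
```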
./examples/log-rolling.js
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { console: { type: 'console', }, file: { type: 'file', filename: 'tmp-test.log', maxLogSize: 1024, backups: 3, }, }, categories: { default: { appenders: ['console', 'file'], level: 'info' }, }, }); const log = log4js.getLogger('test'); function doTheLogging(x) { log.info('Logging something %d', x); } let i = 0; for (; i < 5000; i += 1) { doTheLogging(i); }
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { console: { type: 'console', }, file: { type: 'file', filename: 'tmp-test.log', maxLogSize: 1024, backups: 3, }, }, categories: { default: { appenders: ['console', 'file'], level: 'info' }, }, }); const log = log4js.getLogger('test'); function doTheLogging(x) { log.info('Logging something %d', x); } let i = 0; for (; i < 5000; i += 1) { doTheLogging(i); }
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
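After running the log-rolling example above, the file appender leaves behind `tmp-test.log` plus up to three rolled backups (named `tmp-test.log.1` … `tmp-test.log.3` under the default naming). A quick, illustrative way to inspect them:

```javascript
const fs = require('fs');

// List what examples/log-rolling.js left behind: the live file plus up to
// `backups` rolled copies, each kept close to maxLogSize (1024 bytes here).
fs.readdirSync('.')
  .filter((name) => name.startsWith('tmp-test.log'))
  .sort()
  .forEach((name) => {
    const { size } = fs.statSync(name);
    console.log(`${name}\t${size} bytes`);
  });
```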
./test/tap/stacktraces-test.js
const { test } = require('tap'); test('Stacktraces from errors in different VM context', (t) => { const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); const layout = require('../../lib/layouts').basicLayout; const vm = require('vm'); log4js.configure({ appenders: { vcr: { type: 'recording' } }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, }); const logger = log4js.getLogger(); try { // Access not defined variable. vm.runInNewContext('myVar();', {}, 'myfile.js'); } catch (e) { // Expect to have a stack trace printed. logger.error(e); } const events = recorder.replay(); // recording appender events do not go through layouts, so let's do it const output = layout(events[0]); t.match(output, 'stacktraces-test.js'); t.end(); });
const { test } = require('tap'); test('Stacktraces from errors in different VM context', (t) => { const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); const layout = require('../../lib/layouts').basicLayout; const vm = require('vm'); log4js.configure({ appenders: { vcr: { type: 'recording' } }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, }); const logger = log4js.getLogger(); try { // Access not defined variable. vm.runInNewContext('myVar();', {}, 'myfile.js'); } catch (e) { // Expect to have a stack trace printed. logger.error(e); } const events = recorder.replay(); // recording appender events do not go through layouts, so let's do it const output = layout(events[0]); t.match(output, 'stacktraces-test.js'); t.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
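The recording appender used by the stacktraces test above is generally useful for asserting on log output without touching the file system. A minimal standalone sketch (the deep require path is an assumption for an installed package; the tests above require it relatively):

```javascript
const log4js = require('log4js');
// Path assumed when log4js is installed as a dependency.
const recording = require('log4js/lib/appenders/recording');

log4js.configure({
  appenders: { vcr: { type: 'recording' } },
  categories: { default: { appenders: ['vcr'], level: 'debug' } },
});

log4js.getLogger('cheese').warn('too ripe');

const events = recording.replay(); // raw LoggingEvents; no layout applied
console.log(events[0].level.levelStr, events[0].data[0]); // WARN too ripe
recording.reset(); // clear the buffer between tests
```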
./test/tap/levels-test.js
const { test } = require('tap'); const levels = require('../../lib/levels'); function assertThat(assert, level) { function assertForEach(assertion, testFn, otherLevels) { otherLevels.forEach((other) => { assertion.call(assert, testFn.call(level, other)); }); } return { isLessThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isLessThanOrEqualTo, lvls); }, isNotLessThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isLessThanOrEqualTo, lvls); }, isGreaterThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isGreaterThanOrEqualTo, lvls); }, isNotGreaterThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isGreaterThanOrEqualTo, lvls); }, isEqualTo(lvls) { assertForEach(assert.ok, level.isEqualTo, lvls); }, isNotEqualTo(lvls) { assertForEach(assert.notOk, level.isEqualTo, lvls); }, }; } test('levels', (batch) => { batch.test('values', (t) => { t.test('should define some levels', (assert) => { assert.ok(levels.ALL); assert.ok(levels.TRACE); assert.ok(levels.DEBUG); assert.ok(levels.INFO); assert.ok(levels.WARN); assert.ok(levels.ERROR); assert.ok(levels.FATAL); assert.ok(levels.MARK); assert.ok(levels.OFF); assert.end(); }); t.test('ALL', (assert) => { const all = levels.ALL; assertThat(assert, all).isLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isNotGreaterThanOrEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isEqualTo([levels.getLevel('ALL')]); assertThat(assert, all).isNotEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('TRACE', (assert) => { const trace = levels.TRACE; assertThat(assert, trace).isLessThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isNotLessThanOrEqualTo([levels.ALL]); assertThat(assert, trace).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, trace).isNotGreaterThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isEqualTo([levels.getLevel('TRACE')]); assertThat(assert, trace).isNotEqualTo([ levels.ALL, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('DEBUG', (assert) => { const debug = levels.DEBUG; assertThat(assert, debug).isLessThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isNotGreaterThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isEqualTo([levels.getLevel('DEBUG')]); assertThat(assert, debug).isNotEqualTo([ levels.ALL, levels.TRACE, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('INFO', (assert) => { const info = levels.INFO; assertThat(assert, info).isLessThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isGreaterThanOrEqualTo([ levels.ALL, 
levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isNotGreaterThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isEqualTo([levels.getLevel('INFO')]); assertThat(assert, info).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('WARN', (assert) => { const warn = levels.WARN; assertThat(assert, warn).isLessThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isNotGreaterThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isEqualTo([levels.getLevel('WARN')]); assertThat(assert, warn).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('ERROR', (assert) => { const error = levels.ERROR; assertThat(assert, error).isLessThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isNotGreaterThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isEqualTo([levels.getLevel('ERROR')]); assertThat(assert, error).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('FATAL', (assert) => { const fatal = levels.FATAL; assertThat(assert, fatal).isLessThanOrEqualTo([levels.MARK, levels.OFF]); assertThat(assert, fatal).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isNotGreaterThanOrEqualTo([ levels.MARK, levels.OFF, ]); assertThat(assert, fatal).isEqualTo([levels.getLevel('FATAL')]); assertThat(assert, fatal).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('MARK', (assert) => { const mark = levels.MARK; assertThat(assert, mark).isLessThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.ERROR, ]); assertThat(assert, mark).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, ]); assertThat(assert, mark).isNotGreaterThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isEqualTo([levels.getLevel('MARK')]); assertThat(assert, mark).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('OFF', (assert) => { const off = levels.OFF; assertThat(assert, off).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, 
levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isEqualTo([levels.getLevel('OFF')]); assertThat(assert, off).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assert.end(); }); t.end(); }); batch.test('isGreaterThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isGreaterThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isNotGreaterThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isLessThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isNotLessThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isLessThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isEqualTo(['info', 'INFO', 'iNfO']); t.end(); }); batch.test('getLevel', (t) => { t.equal(levels.getLevel('debug'), levels.DEBUG); t.equal(levels.getLevel('DEBUG'), levels.DEBUG); t.equal(levels.getLevel('DeBuG'), levels.DEBUG); t.notOk(levels.getLevel('cheese')); t.equal(levels.getLevel('cheese', levels.DEBUG), levels.DEBUG); t.equal( levels.getLevel({ level: 10000, levelStr: 'DEBUG', colour: 'cyan' }), levels.DEBUG ); t.end(); }); batch.end(); });
const { test } = require('tap'); const levels = require('../../lib/levels'); function assertThat(assert, level) { function assertForEach(assertion, testFn, otherLevels) { otherLevels.forEach((other) => { assertion.call(assert, testFn.call(level, other)); }); } return { isLessThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isLessThanOrEqualTo, lvls); }, isNotLessThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isLessThanOrEqualTo, lvls); }, isGreaterThanOrEqualTo(lvls) { assertForEach(assert.ok, level.isGreaterThanOrEqualTo, lvls); }, isNotGreaterThanOrEqualTo(lvls) { assertForEach(assert.notOk, level.isGreaterThanOrEqualTo, lvls); }, isEqualTo(lvls) { assertForEach(assert.ok, level.isEqualTo, lvls); }, isNotEqualTo(lvls) { assertForEach(assert.notOk, level.isEqualTo, lvls); }, }; } test('levels', (batch) => { batch.test('values', (t) => { t.test('should define some levels', (assert) => { assert.ok(levels.ALL); assert.ok(levels.TRACE); assert.ok(levels.DEBUG); assert.ok(levels.INFO); assert.ok(levels.WARN); assert.ok(levels.ERROR); assert.ok(levels.FATAL); assert.ok(levels.MARK); assert.ok(levels.OFF); assert.end(); }); t.test('ALL', (assert) => { const all = levels.ALL; assertThat(assert, all).isLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isNotGreaterThanOrEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, all).isEqualTo([levels.getLevel('ALL')]); assertThat(assert, all).isNotEqualTo([ levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('TRACE', (assert) => { const trace = levels.TRACE; assertThat(assert, trace).isLessThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isNotLessThanOrEqualTo([levels.ALL]); assertThat(assert, trace).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, trace).isNotGreaterThanOrEqualTo([ levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, trace).isEqualTo([levels.getLevel('TRACE')]); assertThat(assert, trace).isNotEqualTo([ levels.ALL, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('DEBUG', (assert) => { const debug = levels.DEBUG; assertThat(assert, debug).isLessThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, ]); assertThat(assert, debug).isNotGreaterThanOrEqualTo([ levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, debug).isEqualTo([levels.getLevel('DEBUG')]); assertThat(assert, debug).isNotEqualTo([ levels.ALL, levels.TRACE, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('INFO', (assert) => { const info = levels.INFO; assertThat(assert, info).isLessThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isGreaterThanOrEqualTo([ levels.ALL, 
levels.TRACE, levels.DEBUG, ]); assertThat(assert, info).isNotGreaterThanOrEqualTo([ levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, info).isEqualTo([levels.getLevel('INFO')]); assertThat(assert, info).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('WARN', (assert) => { const warn = levels.WARN; assertThat(assert, warn).isLessThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, ]); assertThat(assert, warn).isNotGreaterThanOrEqualTo([ levels.ERROR, levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, warn).isEqualTo([levels.getLevel('WARN')]); assertThat(assert, warn).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('ERROR', (assert) => { const error = levels.ERROR; assertThat(assert, error).isLessThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, ]); assertThat(assert, error).isNotGreaterThanOrEqualTo([ levels.FATAL, levels.MARK, levels.OFF, ]); assertThat(assert, error).isEqualTo([levels.getLevel('ERROR')]); assertThat(assert, error).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('FATAL', (assert) => { const fatal = levels.FATAL; assertThat(assert, fatal).isLessThanOrEqualTo([levels.MARK, levels.OFF]); assertThat(assert, fatal).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, ]); assertThat(assert, fatal).isNotGreaterThanOrEqualTo([ levels.MARK, levels.OFF, ]); assertThat(assert, fatal).isEqualTo([levels.getLevel('FATAL')]); assertThat(assert, fatal).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.MARK, levels.OFF, ]); assert.end(); }); t.test('MARK', (assert) => { const mark = levels.MARK; assertThat(assert, mark).isLessThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.FATAL, levels.ERROR, ]); assertThat(assert, mark).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, ]); assertThat(assert, mark).isNotGreaterThanOrEqualTo([levels.OFF]); assertThat(assert, mark).isEqualTo([levels.getLevel('MARK')]); assertThat(assert, mark).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.OFF, ]); assert.end(); }); t.test('OFF', (assert) => { const off = levels.OFF; assertThat(assert, off).isNotLessThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isGreaterThanOrEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, 
levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assertThat(assert, off).isEqualTo([levels.getLevel('OFF')]); assertThat(assert, off).isNotEqualTo([ levels.ALL, levels.TRACE, levels.DEBUG, levels.INFO, levels.WARN, levels.ERROR, levels.FATAL, levels.MARK, ]); assert.end(); }); t.end(); }); batch.test('isGreaterThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isGreaterThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isNotGreaterThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isLessThanOrEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isNotLessThanOrEqualTo(['all', 'trace', 'debug']); assertThat(t, info).isLessThanOrEqualTo([ 'warn', 'ERROR', 'Fatal', 'MARK', 'off', ]); t.end(); }); batch.test('isEqualTo', (t) => { const info = levels.INFO; assertThat(t, info).isEqualTo(['info', 'INFO', 'iNfO']); t.end(); }); batch.test('getLevel', (t) => { t.equal(levels.getLevel('debug'), levels.DEBUG); t.equal(levels.getLevel('DEBUG'), levels.DEBUG); t.equal(levels.getLevel('DeBuG'), levels.DEBUG); t.notOk(levels.getLevel('cheese')); t.equal(levels.getLevel('cheese', levels.DEBUG), levels.DEBUG); t.equal( levels.getLevel({ level: 10000, levelStr: 'DEBUG', colour: 'cyan' }), levels.DEBUG ); t.end(); }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
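Outside the test harness, the same level objects and comparison helpers exercised above are reachable from the main module as `log4js.levels`. A short sketch:

```javascript
const log4js = require('log4js');
const { levels } = log4js;

const threshold = levels.getLevel('WARN');

// The comparison helpers accept Level objects or (case-insensitive) strings,
// as the isGreaterThanOrEqualTo / isLessThanOrEqualTo batches above show.
console.log(levels.ERROR.isGreaterThanOrEqualTo(threshold)); // true
console.log(levels.INFO.isGreaterThanOrEqualTo('warn')); // false
console.log(levels.getLevel('cheese', levels.DEBUG).levelStr); // DEBUG (fallback value)
```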
./test/tap/noLogFilter-test.js
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); /** * test a simple regexp */ test('log4js noLogFilter', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test( 'appender should exclude events that match the regexp string', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: 'This.*not', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.end(); } ); /** * test an array of regexp */ batch.test( 'appender should exclude events that match the regexp string contained in the array', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['This.*not', 'instead'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); logger.debug('This case instead it should get logged'); logger.debug('The last that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 3); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.equal(logEvents[2].data[0], 'The last that should get logged'); t.end(); } ); /** * test case insentitive regexp */ batch.test( 'appender should evaluate the regexp using incase sentitive option', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', 'eX.*de'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug('Exclude this string'); logger.debug('Include this string'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Include this string'); t.end(); } ); /** * test empty string or null regexp */ batch.test( 'appender should skip the match in case of empty or null regexp', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['', null, undefined], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('Another string that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Another string that should get logged'); t.end(); } ); /** * test for excluding all the events that contains digits */ 
batch.test('appender should exclude the events that contains digits', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: '\\d', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('The 2nd event should not get logged'); logger.debug('The 3rd event should not get logged, such as the 2nd'); const logEvents = recording.replay(); t.equal(logEvents.length, 1); t.equal(logEvents[0].data[0], 'This should get logged'); t.end(); }); /** * test the cases provided in the documentation * https://log4js-node.github.io/log4js-node/noLogFilter.html */ batch.test( 'appender should exclude not valid events according to the documentation', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', '\\d', ''], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('I will be logged in all-the-logs.log'); logger.debug('I will be not logged in all-the-logs.log'); logger.debug('A 2nd message that will be excluded in all-the-logs.log'); logger.debug('Hello again'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'I will be logged in all-the-logs.log'); t.equal(logEvents[1].data[0], 'Hello again'); t.end(); } ); batch.end(); });
const { test } = require('tap'); const log4js = require('../../lib/log4js'); const recording = require('../../lib/appenders/recording'); /** * test a simple regexp */ test('log4js noLogFilter', (batch) => { batch.beforeEach((done) => { recording.reset(); if (typeof done === 'function') { done(); } }); batch.test( 'appender should exclude events that match the regexp string', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: 'This.*not', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.end(); } ); /** * test an array of regexp */ batch.test( 'appender should exclude events that match the regexp string contained in the array', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['This.*not', 'instead'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug( 'Another case that not match the regex, so it should get logged' ); logger.debug('This case instead it should get logged'); logger.debug('The last that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 3); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal( logEvents[1].data[0], 'Another case that not match the regex, so it should get logged' ); t.equal(logEvents[2].data[0], 'The last that should get logged'); t.end(); } ); /** * test case insentitive regexp */ batch.test( 'appender should evaluate the regexp using incase sentitive option', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', 'eX.*de'], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should not get logged'); logger.debug('This should get logged'); logger.debug('Exclude this string'); logger.debug('Include this string'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Include this string'); t.end(); } ); /** * test empty string or null regexp */ batch.test( 'appender should skip the match in case of empty or null regexp', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['', null, undefined], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('Another string that should get logged'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'This should get logged'); t.equal(logEvents[1].data[0], 'Another string that should get logged'); t.end(); } ); /** * test for excluding all the events that contains digits */ 
batch.test('appender should exclude the events that contains digits', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: '\\d', appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('This should get logged'); logger.debug('The 2nd event should not get logged'); logger.debug('The 3rd event should not get logged, such as the 2nd'); const logEvents = recording.replay(); t.equal(logEvents.length, 1); t.equal(logEvents[0].data[0], 'This should get logged'); t.end(); }); /** * test the cases provided in the documentation * https://log4js-node.github.io/log4js-node/noLogFilter.html */ batch.test( 'appender should exclude not valid events according to the documentation', (t) => { log4js.configure({ appenders: { recorder: { type: 'recording' }, filtered: { type: 'noLogFilter', exclude: ['NOT', '\\d', ''], appender: 'recorder', }, }, categories: { default: { appenders: ['filtered'], level: 'DEBUG' } }, }); const logger = log4js.getLogger(); logger.debug('I will be logged in all-the-logs.log'); logger.debug('I will be not logged in all-the-logs.log'); logger.debug('A 2nd message that will be excluded in all-the-logs.log'); logger.debug('Hello again'); const logEvents = recording.replay(); t.equal(logEvents.length, 2); t.equal(logEvents[0].data[0], 'I will be logged in all-the-logs.log'); t.equal(logEvents[1].data[0], 'Hello again'); t.end(); } ); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
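The noLogFilter can sit in front of any real appender; below is a sketch mirroring the documented `all-the-logs.log` scenario referenced in the test comments above (filename illustrative):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    everything: { type: 'file', filename: 'all-the-logs.log' },
    filtered: {
      type: 'noLogFilter',
      exclude: ['NOT', '\\d', ''], // empty/null patterns are skipped, as tested above
      appender: 'everything',
    },
  },
  categories: { default: { appenders: ['filtered'], level: 'debug' } },
});

const logger = log4js.getLogger();
logger.debug('I will be logged in all-the-logs.log');
logger.debug('I will be not logged in all-the-logs.log'); // dropped: matches NOT (case-insensitive)
logger.debug('A 2nd message that will be excluded in all-the-logs.log'); // dropped: matches \d
```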
./test/tap/file-descriptor-leak-test.js
const { test } = require('tap'); const fs = require('fs'); const path = require('path'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; // no file descriptors on Windows, so don't run the tests if (process.platform !== 'win32') { test('multiple log4js configure fd leak test', (batch) => { const config = { appenders: {}, categories: { default: { appenders: [], level: 'debug' }, }, }; // create 11 appenders const numOfAppenders = 11; for (let i = 1; i <= numOfAppenders; i++) { config.appenders[`app${i}`] = { type: 'file', filename: path.join(__dirname, `file${i}.log`), }; config.categories.default.appenders.push(`app${i}`); } const initialFd = fs.readdirSync('/proc/self/fd').length; let loadedFd; batch.test( 'initial log4js configure to increase file descriptor count', (t) => { log4js.configure(config); // wait for the file system to catch up setTimeout(() => { loadedFd = fs.readdirSync('/proc/self/fd').length; t.equal( loadedFd, initialFd + numOfAppenders, `file descriptor count should increase by ${numOfAppenders} after 1st configure() call` ); t.end(); }, osDelay); } ); batch.test( 'repeated log4js configure to not increase file descriptor count', (t) => { log4js.configure(config); log4js.configure(config); log4js.configure(config); // wait for the file system to catch up setTimeout(() => { t.equal( fs.readdirSync('/proc/self/fd').length, loadedFd, `file descriptor count should be identical after repeated configure() calls` ); t.end(); }, osDelay); } ); batch.test( 'file descriptor count should return back to initial count', (t) => { log4js.shutdown(); // wait for the file system to catch up setTimeout(() => { t.equal( fs.readdirSync('/proc/self/fd').length, initialFd, `file descriptor count should be back to initial` ); t.end(); }, osDelay); } ); batch.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); const filenames = Object.values(config.appenders).map( (appender) => appender.filename ); await removeFiles(filenames); }); batch.end(); }); }
const { test } = require('tap'); const fs = require('fs'); const path = require('path'); const log4js = require('../../lib/log4js'); const osDelay = process.platform === 'win32' ? 400 : 200; const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; // no file descriptors on Windows, so don't run the tests if (process.platform !== 'win32') { test('multiple log4js configure fd leak test', (batch) => { const config = { appenders: {}, categories: { default: { appenders: [], level: 'debug' }, }, }; // create 11 appenders const numOfAppenders = 11; for (let i = 1; i <= numOfAppenders; i++) { config.appenders[`app${i}`] = { type: 'file', filename: path.join(__dirname, `file${i}.log`), }; config.categories.default.appenders.push(`app${i}`); } const initialFd = fs.readdirSync('/proc/self/fd').length; let loadedFd; batch.test( 'initial log4js configure to increase file descriptor count', (t) => { log4js.configure(config); // wait for the file system to catch up setTimeout(() => { loadedFd = fs.readdirSync('/proc/self/fd').length; t.equal( loadedFd, initialFd + numOfAppenders, `file descriptor count should increase by ${numOfAppenders} after 1st configure() call` ); t.end(); }, osDelay); } ); batch.test( 'repeated log4js configure to not increase file descriptor count', (t) => { log4js.configure(config); log4js.configure(config); log4js.configure(config); // wait for the file system to catch up setTimeout(() => { t.equal( fs.readdirSync('/proc/self/fd').length, loadedFd, `file descriptor count should be identical after repeated configure() calls` ); t.end(); }, osDelay); } ); batch.test( 'file descriptor count should return back to initial count', (t) => { log4js.shutdown(); // wait for the file system to catch up setTimeout(() => { t.equal( fs.readdirSync('/proc/self/fd').length, initialFd, `file descriptor count should be back to initial` ); t.end(); }, osDelay); } ); batch.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); const filenames = Object.values(config.appenders).map( (appender) => appender.filename ); await removeFiles(filenames); }); batch.end(); }); }
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./lib/appenders/tcp-server.js
const debug = require('debug')('log4js:tcp-server'); const net = require('net'); const clustering = require('../clustering'); const LoggingEvent = require('../LoggingEvent'); const DELIMITER = '__LOG4JS__'; exports.configure = (config) => { debug('configure called with ', config); const server = net.createServer((socket) => { let dataSoFar = ''; const send = (data) => { if (data) { dataSoFar += data; if (dataSoFar.indexOf(DELIMITER)) { const events = dataSoFar.split(DELIMITER); if (!dataSoFar.endsWith(DELIMITER)) { dataSoFar = events.pop(); } else { dataSoFar = ''; } events .filter((e) => e.length) .forEach((e) => { clustering.send(LoggingEvent.deserialise(e)); }); } else { dataSoFar = ''; } } }; socket.setEncoding('utf8'); socket.on('data', send); socket.on('end', send); }); server.listen(config.port || 5000, config.host || 'localhost', () => { debug(`listening on ${config.host || 'localhost'}:${config.port || 5000}`); server.unref(); }); return { shutdown: (cb) => { debug('shutdown called.'); server.close(cb); }, }; };
const debug = require('debug')('log4js:tcp-server'); const net = require('net'); const clustering = require('../clustering'); const LoggingEvent = require('../LoggingEvent'); const DELIMITER = '__LOG4JS__'; exports.configure = (config) => { debug('configure called with ', config); const server = net.createServer((socket) => { let dataSoFar = ''; const send = (data) => { if (data) { dataSoFar += data; if (dataSoFar.indexOf(DELIMITER)) { const events = dataSoFar.split(DELIMITER); if (!dataSoFar.endsWith(DELIMITER)) { dataSoFar = events.pop(); } else { dataSoFar = ''; } events .filter((e) => e.length) .forEach((e) => { clustering.send(LoggingEvent.deserialise(e)); }); } else { dataSoFar = ''; } } }; socket.setEncoding('utf8'); socket.on('data', send); socket.on('end', send); }); server.listen(config.port || 5000, config.host || 'localhost', () => { debug(`listening on ${config.host || 'localhost'}:${config.port || 5000}`); server.unref(); }); return { shutdown: (cb) => { debug('shutdown called.'); server.close(cb); }, }; };
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./lib/appenders/categoryFilter.js
const debug = require('debug')('log4js:categoryFilter'); function categoryFilter(excludes, appender) { if (typeof excludes === 'string') excludes = [excludes]; return (logEvent) => { debug(`Checking ${logEvent.categoryName} against ${excludes}`); if (excludes.indexOf(logEvent.categoryName) === -1) { debug('Not excluded, sending to appender'); appender(logEvent); } }; } function configure(config, layouts, findAppender) { const appender = findAppender(config.appender); return categoryFilter(config.exclude, appender); } module.exports.configure = configure;
const debug = require('debug')('log4js:categoryFilter'); function categoryFilter(excludes, appender) { if (typeof excludes === 'string') excludes = [excludes]; return (logEvent) => { debug(`Checking ${logEvent.categoryName} against ${excludes}`); if (excludes.indexOf(logEvent.categoryName) === -1) { debug('Not excluded, sending to appender'); appender(logEvent); } }; } function configure(config, layouts, findAppender) { const appender = findAppender(config.appender); return categoryFilter(config.exclude, appender); } module.exports.configure = configure;
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/disable-cluster-test.js
const { test } = require('tap'); const cluster = require('cluster'); const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); cluster.removeAllListeners(); log4js.configure({ appenders: { vcr: { type: 'recording' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, disableClustering: true, }); if (cluster.isMaster) { cluster.fork(); const masterLogger = log4js.getLogger('master'); const masterPid = process.pid; masterLogger.info('this is master'); cluster.on('exit', () => { const logEvents = recorder.replay(); test('cluster master', (batch) => { batch.test('only master events should be logged', (t) => { t.equal(logEvents.length, 1); t.equal(logEvents[0].categoryName, 'master'); t.equal(logEvents[0].pid, masterPid); t.equal(logEvents[0].data[0], 'this is master'); t.end(); }); batch.end(); }); }); } else { const workerLogger = log4js.getLogger('worker'); workerLogger.info('this is worker', new Error('oh dear')); const workerEvents = recorder.replay(); test('cluster worker', (batch) => { batch.test('should send events to its own appender', (t) => { t.equal(workerEvents.length, 1); t.equal(workerEvents[0].categoryName, 'worker'); t.equal(workerEvents[0].data[0], 'this is worker'); t.type(workerEvents[0].data[1], 'Error'); t.match(workerEvents[0].data[1].stack, 'Error: oh dear'); t.end(); }); batch.end(); }); // test sending a cluster-style log message process.send({ topic: 'log4js:message', data: { cheese: 'gouda' } }); cluster.worker.disconnect(); }
const { test } = require('tap'); const cluster = require('cluster'); const log4js = require('../../lib/log4js'); const recorder = require('../../lib/appenders/recording'); cluster.removeAllListeners(); log4js.configure({ appenders: { vcr: { type: 'recording' }, }, categories: { default: { appenders: ['vcr'], level: 'debug' } }, disableClustering: true, }); if (cluster.isMaster) { cluster.fork(); const masterLogger = log4js.getLogger('master'); const masterPid = process.pid; masterLogger.info('this is master'); cluster.on('exit', () => { const logEvents = recorder.replay(); test('cluster master', (batch) => { batch.test('only master events should be logged', (t) => { t.equal(logEvents.length, 1); t.equal(logEvents[0].categoryName, 'master'); t.equal(logEvents[0].pid, masterPid); t.equal(logEvents[0].data[0], 'this is master'); t.end(); }); batch.end(); }); }); } else { const workerLogger = log4js.getLogger('worker'); workerLogger.info('this is worker', new Error('oh dear')); const workerEvents = recorder.replay(); test('cluster worker', (batch) => { batch.test('should send events to its own appender', (t) => { t.equal(workerEvents.length, 1); t.equal(workerEvents[0].categoryName, 'worker'); t.equal(workerEvents[0].data[0], 'this is worker'); t.type(workerEvents[0].data[1], 'Error'); t.match(workerEvents[0].data[1].stack, 'Error: oh dear'); t.end(); }); batch.end(); }); // test sending a cluster-style log message process.send({ topic: 'log4js:message', data: { cheese: 'gouda' } }); cluster.worker.disconnect(); }
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./lib/categories.js
const debug = require('debug')('log4js:categories'); const configuration = require('./configuration'); const levels = require('./levels'); const appenders = require('./appenders'); const categories = new Map(); /** * Add inherited config to this category. That includes extra appenders from parent, * and level, if none is set on this category. * This is recursive, so each parent also gets loaded with inherited appenders. * Inheritance is blocked if a category has inherit=false * @param {*} config * @param {*} category the child category * @param {string} categoryName dotted path to category * @return {void} */ function inheritFromParent(config, category, categoryName) { if (category.inherit === false) return; const lastDotIndex = categoryName.lastIndexOf('.'); if (lastDotIndex < 0) return; // category is not a child const parentCategoryName = categoryName.slice(0, lastDotIndex); let parentCategory = config.categories[parentCategoryName]; if (!parentCategory) { // parent is missing, so implicitly create it, so that it can inherit from its parents parentCategory = { inherit: true, appenders: [] }; } // make sure parent has had its inheritance taken care of before pulling its properties to this child inheritFromParent(config, parentCategory, parentCategoryName); // if the parent is not in the config (because we just created it above), // and it inherited a valid configuration, add it to config.categories if ( !config.categories[parentCategoryName] && parentCategory.appenders && parentCategory.appenders.length && parentCategory.level ) { config.categories[parentCategoryName] = parentCategory; } category.appenders = category.appenders || []; category.level = category.level || parentCategory.level; // merge in appenders from parent (parent is already holding its inherited appenders) parentCategory.appenders.forEach((ap) => { if (!category.appenders.includes(ap)) { category.appenders.push(ap); } }); category.parent = parentCategory; } /** * Walk all categories in the config, and pull down any configuration from parent to child. * This includes inherited appenders, and level, where level is not set. * Inheritance is skipped where a category has inherit=false. * @param {*} config */ function addCategoryInheritance(config) { if (!config.categories) return; const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; // add inherited appenders and level to this category inheritFromParent(config, category, name); }); } configuration.addPreProcessingListener((config) => addCategoryInheritance(config) ); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.categories)), 'must have a property "categories" of type object.' ); const categoryNames = Object.keys(config.categories); configuration.throwExceptionIf( config, configuration.not(categoryNames.length), 'must define at least one category.' 
); categoryNames.forEach((name) => { const category = config.categories[name]; configuration.throwExceptionIf( config, [ configuration.not(category.appenders), configuration.not(category.level), ], `category "${name}" is not valid (must be an object with properties "appenders" and "level")` ); configuration.throwExceptionIf( config, configuration.not(Array.isArray(category.appenders)), `category "${name}" is not valid (appenders must be an array of appender names)` ); configuration.throwExceptionIf( config, configuration.not(category.appenders.length), `category "${name}" is not valid (appenders must contain at least one appender name)` ); if (Object.prototype.hasOwnProperty.call(category, 'enableCallStack')) { configuration.throwExceptionIf( config, typeof category.enableCallStack !== 'boolean', `category "${name}" is not valid (enableCallStack must be boolean type)` ); } category.appenders.forEach((appender) => { configuration.throwExceptionIf( config, configuration.not(appenders.get(appender)), `category "${name}" is not valid (appender "${appender}" is not defined)` ); }); configuration.throwExceptionIf( config, configuration.not(levels.getLevel(category.level)), `category "${name}" is not valid (level "${category.level}" not recognised;` + ` valid levels are ${levels.levels.join(', ')})` ); }); configuration.throwExceptionIf( config, configuration.not(config.categories.default), 'must define a "default" category.' ); }); const setup = (config) => { categories.clear(); if (!config) { return; } const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; const categoryAppenders = []; category.appenders.forEach((appender) => { categoryAppenders.push(appenders.get(appender)); debug(`Creating category ${name}`); categories.set(name, { appenders: categoryAppenders, level: levels.getLevel(category.level), enableCallStack: category.enableCallStack || false, }); }); }); }; const init = () => { setup(); }; init(); configuration.addListener(setup); const configForCategory = (category) => { debug(`configForCategory: searching for config for ${category}`); if (categories.has(category)) { debug(`configForCategory: ${category} exists in config, returning it`); return categories.get(category); } let sourceCategoryConfig; if (category.indexOf('.') > 0) { debug(`configForCategory: ${category} has hierarchy, cloning from parents`); sourceCategoryConfig = { ...configForCategory(category.slice(0, category.lastIndexOf('.'))), }; } else { if (!categories.has('default')) { setup({ categories: { default: { appenders: ['out'], level: 'OFF' } } }); } debug('configForCategory: cloning default category'); sourceCategoryConfig = { ...categories.get('default') }; } categories.set(category, sourceCategoryConfig); return sourceCategoryConfig; }; const appendersForCategory = (category) => configForCategory(category).appenders; const getLevelForCategory = (category) => configForCategory(category).level; const setLevelForCategory = (category, level) => { configForCategory(category).level = level; }; const getEnableCallStackForCategory = (category) => configForCategory(category).enableCallStack === true; const setEnableCallStackForCategory = (category, useCallStack) => { configForCategory(category).enableCallStack = useCallStack; }; module.exports = categories; module.exports = Object.assign(module.exports, { appendersForCategory, getLevelForCategory, setLevelForCategory, getEnableCallStackForCategory, setEnableCallStackForCategory, init, });
const debug = require('debug')('log4js:categories'); const configuration = require('./configuration'); const levels = require('./levels'); const appenders = require('./appenders'); const categories = new Map(); /** * Add inherited config to this category. That includes extra appenders from parent, * and level, if none is set on this category. * This is recursive, so each parent also gets loaded with inherited appenders. * Inheritance is blocked if a category has inherit=false * @param {*} config * @param {*} category the child category * @param {string} categoryName dotted path to category * @return {void} */ function inheritFromParent(config, category, categoryName) { if (category.inherit === false) return; const lastDotIndex = categoryName.lastIndexOf('.'); if (lastDotIndex < 0) return; // category is not a child const parentCategoryName = categoryName.slice(0, lastDotIndex); let parentCategory = config.categories[parentCategoryName]; if (!parentCategory) { // parent is missing, so implicitly create it, so that it can inherit from its parents parentCategory = { inherit: true, appenders: [] }; } // make sure parent has had its inheritance taken care of before pulling its properties to this child inheritFromParent(config, parentCategory, parentCategoryName); // if the parent is not in the config (because we just created it above), // and it inherited a valid configuration, add it to config.categories if ( !config.categories[parentCategoryName] && parentCategory.appenders && parentCategory.appenders.length && parentCategory.level ) { config.categories[parentCategoryName] = parentCategory; } category.appenders = category.appenders || []; category.level = category.level || parentCategory.level; // merge in appenders from parent (parent is already holding its inherited appenders) parentCategory.appenders.forEach((ap) => { if (!category.appenders.includes(ap)) { category.appenders.push(ap); } }); category.parent = parentCategory; } /** * Walk all categories in the config, and pull down any configuration from parent to child. * This includes inherited appenders, and level, where level is not set. * Inheritance is skipped where a category has inherit=false. * @param {*} config */ function addCategoryInheritance(config) { if (!config.categories) return; const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; // add inherited appenders and level to this category inheritFromParent(config, category, name); }); } configuration.addPreProcessingListener((config) => addCategoryInheritance(config) ); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.categories)), 'must have a property "categories" of type object.' ); const categoryNames = Object.keys(config.categories); configuration.throwExceptionIf( config, configuration.not(categoryNames.length), 'must define at least one category.' 
); categoryNames.forEach((name) => { const category = config.categories[name]; configuration.throwExceptionIf( config, [ configuration.not(category.appenders), configuration.not(category.level), ], `category "${name}" is not valid (must be an object with properties "appenders" and "level")` ); configuration.throwExceptionIf( config, configuration.not(Array.isArray(category.appenders)), `category "${name}" is not valid (appenders must be an array of appender names)` ); configuration.throwExceptionIf( config, configuration.not(category.appenders.length), `category "${name}" is not valid (appenders must contain at least one appender name)` ); if (Object.prototype.hasOwnProperty.call(category, 'enableCallStack')) { configuration.throwExceptionIf( config, typeof category.enableCallStack !== 'boolean', `category "${name}" is not valid (enableCallStack must be boolean type)` ); } category.appenders.forEach((appender) => { configuration.throwExceptionIf( config, configuration.not(appenders.get(appender)), `category "${name}" is not valid (appender "${appender}" is not defined)` ); }); configuration.throwExceptionIf( config, configuration.not(levels.getLevel(category.level)), `category "${name}" is not valid (level "${category.level}" not recognised;` + ` valid levels are ${levels.levels.join(', ')})` ); }); configuration.throwExceptionIf( config, configuration.not(config.categories.default), 'must define a "default" category.' ); }); const setup = (config) => { categories.clear(); if (!config) { return; } const categoryNames = Object.keys(config.categories); categoryNames.forEach((name) => { const category = config.categories[name]; const categoryAppenders = []; category.appenders.forEach((appender) => { categoryAppenders.push(appenders.get(appender)); debug(`Creating category ${name}`); categories.set(name, { appenders: categoryAppenders, level: levels.getLevel(category.level), enableCallStack: category.enableCallStack || false, }); }); }); }; const init = () => { setup(); }; init(); configuration.addListener(setup); const configForCategory = (category) => { debug(`configForCategory: searching for config for ${category}`); if (categories.has(category)) { debug(`configForCategory: ${category} exists in config, returning it`); return categories.get(category); } let sourceCategoryConfig; if (category.indexOf('.') > 0) { debug(`configForCategory: ${category} has hierarchy, cloning from parents`); sourceCategoryConfig = { ...configForCategory(category.slice(0, category.lastIndexOf('.'))), }; } else { if (!categories.has('default')) { setup({ categories: { default: { appenders: ['out'], level: 'OFF' } } }); } debug('configForCategory: cloning default category'); sourceCategoryConfig = { ...categories.get('default') }; } categories.set(category, sourceCategoryConfig); return sourceCategoryConfig; }; const appendersForCategory = (category) => configForCategory(category).appenders; const getLevelForCategory = (category) => configForCategory(category).level; const setLevelForCategory = (category, level) => { configForCategory(category).level = level; }; const getEnableCallStackForCategory = (category) => configForCategory(category).enableCallStack === true; const setEnableCallStackForCategory = (category, useCallStack) => { configForCategory(category).enableCallStack = useCallStack; }; module.exports = categories; module.exports = Object.assign(module.exports, { appendersForCategory, getLevelForCategory, setLevelForCategory, getEnableCallStackForCategory, setEnableCallStackForCategory, init, });
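As a rough illustration of the inheritance walk implemented in `inheritFromParent` above: a child category named with a dotted path picks up the level and appenders it does not define from its parent. The category, appender, and file names below are invented for the example; only the configuration shape is the library's.

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    out: { type: 'stdout' },
    file: { type: 'file', filename: 'app.log' },
  },
  categories: {
    default: { appenders: ['out'], level: 'info' },
    app: { appenders: ['file'], level: 'debug' },
    // "app.db" defines no level, so it inherits "debug" from "app",
    // and the parent's "file" appender is merged into its own list.
    // Setting inherit: false on a child category would block this merging.
    'app.db': { appenders: ['out'] },
  },
});

const dbLogger = log4js.getLogger('app.db');
dbLogger.debug('written to stdout and to app.log');
```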
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./lib/appenders/index.js
const path = require('path'); const debug = require('debug')('log4js:appenders'); const configuration = require('../configuration'); const clustering = require('../clustering'); const levels = require('../levels'); const layouts = require('../layouts'); const adapters = require('./adapters'); // pre-load the core appenders so that webpack can find them const coreAppenders = new Map(); coreAppenders.set('console', require('./console')); coreAppenders.set('stdout', require('./stdout')); coreAppenders.set('stderr', require('./stderr')); coreAppenders.set('logLevelFilter', require('./logLevelFilter')); coreAppenders.set('categoryFilter', require('./categoryFilter')); coreAppenders.set('noLogFilter', require('./noLogFilter')); coreAppenders.set('file', require('./file')); coreAppenders.set('dateFile', require('./dateFile')); coreAppenders.set('fileSync', require('./fileSync')); coreAppenders.set('tcp', require('./tcp')); const appenders = new Map(); const tryLoading = (modulePath, config) => { let resolvedPath; try { const modulePathCJS = `${modulePath}.cjs`; resolvedPath = require.resolve(modulePathCJS); debug('Loading module from ', modulePathCJS); } catch (e) { resolvedPath = modulePath; debug('Loading module from ', modulePath); } try { // eslint-disable-next-line global-require, import/no-dynamic-require return require(resolvedPath); } catch (e) { // if the module was found, and we still got an error, then raise it configuration.throwExceptionIf( config, e.code !== 'MODULE_NOT_FOUND', `appender "${modulePath}" could not be loaded (error was: ${e})` ); return undefined; } }; const loadAppenderModule = (type, config) => coreAppenders.get(type) || tryLoading(`./${type}`, config) || tryLoading(type, config) || (require.main && require.main.filename && tryLoading(path.join(path.dirname(require.main.filename), type), config)) || tryLoading(path.join(process.cwd(), type), config); const appendersLoading = new Set(); const getAppender = (name, config) => { if (appenders.has(name)) return appenders.get(name); if (!config.appenders[name]) return false; if (appendersLoading.has(name)) throw new Error(`Dependency loop detected for appender ${name}.`); appendersLoading.add(name); debug(`Creating appender ${name}`); // eslint-disable-next-line no-use-before-define const appender = createAppender(name, config); appendersLoading.delete(name); appenders.set(name, appender); return appender; }; const createAppender = (name, config) => { const appenderConfig = config.appenders[name]; const appenderModule = appenderConfig.type.configure ? appenderConfig.type : loadAppenderModule(appenderConfig.type, config); configuration.throwExceptionIf( config, configuration.not(appenderModule), `appender "${name}" is not valid (type "${appenderConfig.type}" could not be found)` ); if (appenderModule.appender) { process.emitWarning( `Appender ${appenderConfig.type} exports an appender function.`, 'DeprecationWarning', 'log4js-node-DEP0001' ); debug( '[log4js-node-DEP0001]', `DEPRECATION: Appender ${appenderConfig.type} exports an appender function.` ); } if (appenderModule.shutdown) { process.emitWarning( `Appender ${appenderConfig.type} exports a shutdown function.`, 'DeprecationWarning', 'log4js-node-DEP0002' ); debug( '[log4js-node-DEP0002]', `DEPRECATION: Appender ${appenderConfig.type} exports a shutdown function.` ); } debug(`${name}: clustering.isMaster ? 
${clustering.isMaster()}`); debug( // eslint-disable-next-line global-require `${name}: appenderModule is ${require('util').inspect(appenderModule)}` ); return clustering.onlyOnMaster( () => { debug( `calling appenderModule.configure for ${name} / ${appenderConfig.type}` ); return appenderModule.configure( adapters.modifyConfig(appenderConfig), layouts, (appender) => getAppender(appender, config), levels ); }, /* istanbul ignore next: fn never gets called by non-master yet needed to pass config validation */ () => {} ); }; const setup = (config) => { appenders.clear(); appendersLoading.clear(); if (!config) { return; } const usedAppenders = []; Object.values(config.categories).forEach((category) => { usedAppenders.push(...category.appenders); }); Object.keys(config.appenders).forEach((name) => { // dodgy hard-coding of special case for tcp-server and multiprocess which may not have // any categories associated with it, but needs to be started up anyway if ( usedAppenders.includes(name) || config.appenders[name].type === 'tcp-server' || config.appenders[name].type === 'multiprocess' ) { getAppender(name, config); } }); }; const init = () => { setup(); }; init(); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.appenders)), 'must have a property "appenders" of type object.' ); const appenderNames = Object.keys(config.appenders); configuration.throwExceptionIf( config, configuration.not(appenderNames.length), 'must define at least one appender.' ); appenderNames.forEach((name) => { configuration.throwExceptionIf( config, configuration.not(config.appenders[name].type), `appender "${name}" is not valid (must be an object with property "type")` ); }); }); configuration.addListener(setup); module.exports = appenders; module.exports.init = init;
const path = require('path'); const debug = require('debug')('log4js:appenders'); const configuration = require('../configuration'); const clustering = require('../clustering'); const levels = require('../levels'); const layouts = require('../layouts'); const adapters = require('./adapters'); // pre-load the core appenders so that webpack can find them const coreAppenders = new Map(); coreAppenders.set('console', require('./console')); coreAppenders.set('stdout', require('./stdout')); coreAppenders.set('stderr', require('./stderr')); coreAppenders.set('logLevelFilter', require('./logLevelFilter')); coreAppenders.set('categoryFilter', require('./categoryFilter')); coreAppenders.set('noLogFilter', require('./noLogFilter')); coreAppenders.set('file', require('./file')); coreAppenders.set('dateFile', require('./dateFile')); coreAppenders.set('fileSync', require('./fileSync')); coreAppenders.set('tcp', require('./tcp')); const appenders = new Map(); const tryLoading = (modulePath, config) => { let resolvedPath; try { const modulePathCJS = `${modulePath}.cjs`; resolvedPath = require.resolve(modulePathCJS); debug('Loading module from ', modulePathCJS); } catch (e) { resolvedPath = modulePath; debug('Loading module from ', modulePath); } try { // eslint-disable-next-line global-require, import/no-dynamic-require return require(resolvedPath); } catch (e) { // if the module was found, and we still got an error, then raise it configuration.throwExceptionIf( config, e.code !== 'MODULE_NOT_FOUND', `appender "${modulePath}" could not be loaded (error was: ${e})` ); return undefined; } }; const loadAppenderModule = (type, config) => coreAppenders.get(type) || tryLoading(`./${type}`, config) || tryLoading(type, config) || (require.main && require.main.filename && tryLoading(path.join(path.dirname(require.main.filename), type), config)) || tryLoading(path.join(process.cwd(), type), config); const appendersLoading = new Set(); const getAppender = (name, config) => { if (appenders.has(name)) return appenders.get(name); if (!config.appenders[name]) return false; if (appendersLoading.has(name)) throw new Error(`Dependency loop detected for appender ${name}.`); appendersLoading.add(name); debug(`Creating appender ${name}`); // eslint-disable-next-line no-use-before-define const appender = createAppender(name, config); appendersLoading.delete(name); appenders.set(name, appender); return appender; }; const createAppender = (name, config) => { const appenderConfig = config.appenders[name]; const appenderModule = appenderConfig.type.configure ? appenderConfig.type : loadAppenderModule(appenderConfig.type, config); configuration.throwExceptionIf( config, configuration.not(appenderModule), `appender "${name}" is not valid (type "${appenderConfig.type}" could not be found)` ); if (appenderModule.appender) { process.emitWarning( `Appender ${appenderConfig.type} exports an appender function.`, 'DeprecationWarning', 'log4js-node-DEP0001' ); debug( '[log4js-node-DEP0001]', `DEPRECATION: Appender ${appenderConfig.type} exports an appender function.` ); } if (appenderModule.shutdown) { process.emitWarning( `Appender ${appenderConfig.type} exports a shutdown function.`, 'DeprecationWarning', 'log4js-node-DEP0002' ); debug( '[log4js-node-DEP0002]', `DEPRECATION: Appender ${appenderConfig.type} exports a shutdown function.` ); } debug(`${name}: clustering.isMaster ? 
${clustering.isMaster()}`); debug( // eslint-disable-next-line global-require `${name}: appenderModule is ${require('util').inspect(appenderModule)}` ); return clustering.onlyOnMaster( () => { debug( `calling appenderModule.configure for ${name} / ${appenderConfig.type}` ); return appenderModule.configure( adapters.modifyConfig(appenderConfig), layouts, (appender) => getAppender(appender, config), levels ); }, /* istanbul ignore next: fn never gets called by non-master yet needed to pass config validation */ () => {} ); }; const setup = (config) => { appenders.clear(); appendersLoading.clear(); if (!config) { return; } const usedAppenders = []; Object.values(config.categories).forEach((category) => { usedAppenders.push(...category.appenders); }); Object.keys(config.appenders).forEach((name) => { // dodgy hard-coding of special case for tcp-server and multiprocess which may not have // any categories associated with it, but needs to be started up anyway if ( usedAppenders.includes(name) || config.appenders[name].type === 'tcp-server' || config.appenders[name].type === 'multiprocess' ) { getAppender(name, config); } }); }; const init = () => { setup(); }; init(); configuration.addListener((config) => { configuration.throwExceptionIf( config, configuration.not(configuration.anObject(config.appenders)), 'must have a property "appenders" of type object.' ); const appenderNames = Object.keys(config.appenders); configuration.throwExceptionIf( config, configuration.not(appenderNames.length), 'must define at least one appender.' ); appenderNames.forEach((name) => { configuration.throwExceptionIf( config, configuration.not(config.appenders[name].type), `appender "${name}" is not valid (must be an object with property "type")` ); }); }); configuration.addListener(setup); module.exports = appenders; module.exports.init = init;
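For context on what `loadAppenderModule` above is resolving: a non-core `type` can name a module that exports a `configure` function and returns the appender itself. The sketch below is a hypothetical minimal module (its filename, `stamp` option, and behaviour are made up), not one of the bundled appenders.

```javascript
// hypothetical-stamp-appender.js - a made-up module that could be resolved via
// `type: 'hypothetical-stamp-appender'` if it sits next to the main module or
// in the working directory.
exports.configure = (config, layouts) => {
  // fall back to the basic layout when none is configured
  const layout = config.layout
    ? layouts.layout(config.layout.type, config.layout)
    : layouts.basicLayout;

  // the appender itself: a function that receives each logging event
  return (loggingEvent) => {
    process.stdout.write(`${config.stamp || '>>'} ${layout(loggingEvent)}\n`);
  };
};
```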
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/setLevel-asymmetry-test.js
// This test shows an asymmetry between setLevel and isLevelEnabled // (in log4js-node@0.4.3 and earlier): // 1) setLevel("foo") works, but setLevel(log4js.levels.foo) silently // does not (sets the level to TRACE). // 2) isLevelEnabled("foo") works as does isLevelEnabled(log4js.levels.foo). // const { test } = require('tap'); const log4js = require('../../lib/log4js'); const logger = log4js.getLogger('test-setLevel-asymmetry'); // Define the array of levels as string to iterate over. const strLevels = ['Trace', 'Debug', 'Info', 'Warn', 'Error', 'Fatal']; const log4jsLevels = strLevels.map(log4js.levels.getLevel); test('log4js setLevel', (batch) => { strLevels.forEach((strLevel) => { batch.test(`is called with a ${strLevel} as string`, (t) => { const log4jsLevel = log4js.levels.getLevel(strLevel); t.test('should convert string to level correctly', (assert) => { logger.level = strLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.test('should also accept a Level', (assert) => { logger.level = log4jsLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.end(); }); }); batch.end(); });
// This test shows an asymmetry between setLevel and isLevelEnabled // (in log4js-node@0.4.3 and earlier): // 1) setLevel("foo") works, but setLevel(log4js.levels.foo) silently // does not (sets the level to TRACE). // 2) isLevelEnabled("foo") works as does isLevelEnabled(log4js.levels.foo). // const { test } = require('tap'); const log4js = require('../../lib/log4js'); const logger = log4js.getLogger('test-setLevel-asymmetry'); // Define the array of levels as string to iterate over. const strLevels = ['Trace', 'Debug', 'Info', 'Warn', 'Error', 'Fatal']; const log4jsLevels = strLevels.map(log4js.levels.getLevel); test('log4js setLevel', (batch) => { strLevels.forEach((strLevel) => { batch.test(`is called with a ${strLevel} as string`, (t) => { const log4jsLevel = log4js.levels.getLevel(strLevel); t.test('should convert string to level correctly', (assert) => { logger.level = strLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.test('should also accept a Level', (assert) => { logger.level = log4jsLevel; log4jsLevels.forEach((level) => { assert.equal( logger.isLevelEnabled(level), log4jsLevel.isLessThanOrEqualTo(level) ); }); assert.end(); }); t.end(); }); }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/pause-test.js
const tap = require('tap'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; tap.test('Drain event test', (batch) => { batch.test( 'Should emit pause event and resume when logging in a file with high frequency', (t) => { t.teardown(async () => { process.off( 'log4js:pause', process.listeners('log4js:pause')[ process.listeners('log4js:pause').length - 1 ] ); await removeFiles('logs/drain.log'); }); // Generate logger with 5k of highWaterMark config log4js.configure({ appenders: { file: { type: 'file', filename: 'logs/drain.log', highWaterMark: 5 * 1024, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); let paused = false; let resumed = false; process.on('log4js:pause', (value) => { if (value) { paused = true; t.ok(value, 'log4js:pause, true'); } else { resumed = true; t.ok(!value, 'log4js:pause, false'); t.end(); } }); const logger = log4js.getLogger(); while (!paused && !resumed) { if (!paused) { logger.info('This is a test for emitting drain event'); } } } ); batch.test( 'Should emit pause event and resume when logging in a date file with high frequency', (t) => { t.teardown(async () => { process.off( 'log4js:pause', process.listeners('log4js:pause')[ process.listeners('log4js:pause').length - 1 ] ); await removeFiles('logs/date-file-drain.log'); }); // Generate date file logger with 5kb of highWaterMark config log4js.configure({ appenders: { file: { type: 'dateFile', filename: 'logs/date-file-drain.log', highWaterMark: 5 * 1024, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); let paused = false; let resumed = false; process.on('log4js:pause', (value) => { if (value) { paused = true; t.ok(value, 'log4js:pause, true'); } else { resumed = true; t.ok(!value, 'log4js:pause, false'); t.end(); } }); const logger = log4js.getLogger(); while (!paused && !resumed) { if (!paused) logger.info( 'This is a test for emitting drain event in date file logger' ); } } ); batch.teardown(async () => { try { const files = fs.readdirSync('logs'); await removeFiles(files.map((filename) => `logs/${filename}`)); fs.rmdirSync('logs'); } catch (e) { // doesn't matter } }); batch.end(); });
const tap = require('tap'); const fs = require('fs'); const log4js = require('../../lib/log4js'); const removeFiles = async (filenames) => { if (!Array.isArray(filenames)) filenames = [filenames]; const promises = filenames.map((filename) => fs.promises.unlink(filename)); await Promise.allSettled(promises); }; tap.test('Drain event test', (batch) => { batch.test( 'Should emit pause event and resume when logging in a file with high frequency', (t) => { t.teardown(async () => { process.off( 'log4js:pause', process.listeners('log4js:pause')[ process.listeners('log4js:pause').length - 1 ] ); await removeFiles('logs/drain.log'); }); // Generate logger with 5k of highWaterMark config log4js.configure({ appenders: { file: { type: 'file', filename: 'logs/drain.log', highWaterMark: 5 * 1024, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); let paused = false; let resumed = false; process.on('log4js:pause', (value) => { if (value) { paused = true; t.ok(value, 'log4js:pause, true'); } else { resumed = true; t.ok(!value, 'log4js:pause, false'); t.end(); } }); const logger = log4js.getLogger(); while (!paused && !resumed) { if (!paused) { logger.info('This is a test for emitting drain event'); } } } ); batch.test( 'Should emit pause event and resume when logging in a date file with high frequency', (t) => { t.teardown(async () => { process.off( 'log4js:pause', process.listeners('log4js:pause')[ process.listeners('log4js:pause').length - 1 ] ); await removeFiles('logs/date-file-drain.log'); }); // Generate date file logger with 5kb of highWaterMark config log4js.configure({ appenders: { file: { type: 'dateFile', filename: 'logs/date-file-drain.log', highWaterMark: 5 * 1024, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); let paused = false; let resumed = false; process.on('log4js:pause', (value) => { if (value) { paused = true; t.ok(value, 'log4js:pause, true'); } else { resumed = true; t.ok(!value, 'log4js:pause, false'); t.end(); } }); const logger = log4js.getLogger(); while (!paused && !resumed) { if (!paused) logger.info( 'This is a test for emitting drain event in date file logger' ); } } ); batch.teardown(async () => { try { const files = fs.readdirSync('logs'); await removeFiles(files.map((filename) => `logs/${filename}`)); fs.rmdirSync('logs'); } catch (e) { // doesn't matter } }); batch.end(); });
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/memory-test.js
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { logs: { type: 'file', filename: 'memory-test.log', }, console: { type: 'stdout', }, file: { type: 'file', filename: 'memory-usage.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['console'], level: 'info' }, 'memory-test': { appenders: ['logs'], level: 'info' }, 'memory-usage': { appenders: ['console', 'file'], level: 'info' }, }, }); const logger = log4js.getLogger('memory-test'); const usage = log4js.getLogger('memory-usage'); for (let i = 0; i < 1000000; i += 1) { if (i % 5000 === 0) { usage.info('%d %d', i, process.memoryUsage().rss); } logger.info('Doing something.'); } log4js.shutdown(() => {});
const log4js = require('../lib/log4js'); log4js.configure({ appenders: { logs: { type: 'file', filename: 'memory-test.log', }, console: { type: 'stdout', }, file: { type: 'file', filename: 'memory-usage.log', layout: { type: 'messagePassThrough', }, }, }, categories: { default: { appenders: ['console'], level: 'info' }, 'memory-test': { appenders: ['logs'], level: 'info' }, 'memory-usage': { appenders: ['console', 'file'], level: 'info' }, }, }); const logger = log4js.getLogger('memory-test'); const usage = log4js.getLogger('memory-usage'); for (let i = 0; i < 1000000; i += 1) { if (i % 5000 === 0) { usage.info('%d %d', i, process.memoryUsage().rss); } logger.info('Doing something.'); } log4js.shutdown(() => {});
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/example-connect-logger.js
// The connect/express logger was added to log4js by danbell. This allows connect/express servers to log using log4js. // https://github.com/nomiddlename/log4js-node/wiki/Connect-Logger // load modules const log4js = require('log4js'); const express = require('express'); const app = express(); // config log4js.configure({ appenders: { console: { type: 'console' }, file: { type: 'file', filename: 'logs/log4jsconnect.log' }, }, categories: { default: { appenders: ['console'], level: 'debug' }, log4jslog: { appenders: ['file'], level: 'debug' }, }, }); // define logger const logger = log4js.getLogger('log4jslog'); // set at which time msg is logged print like: only on error & above // logger.setLevel('ERROR'); // express app app.use(express.favicon('')); // app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO })); // app.use(log4js.connectLogger(logger, { level: 'auto', format: ':method :url :status' })); // ### AUTO LEVEL DETECTION // http responses 3xx, level = WARN // http responses 4xx & 5xx, level = ERROR // else.level = INFO app.use(log4js.connectLogger(logger, { level: 'auto' })); // route app.get('/', (req, res) => { res.send('hello world'); }); // start app app.listen(5000); console.log('server runing at localhost:5000'); console.log('Simulation of normal response: goto localhost:5000'); console.log('Simulation of error response: goto localhost:5000/xxx');
// The connect/express logger was added to log4js by danbell. This allows connect/express servers to log using log4js. // https://github.com/nomiddlename/log4js-node/wiki/Connect-Logger // load modules const log4js = require('log4js'); const express = require('express'); const app = express(); // config log4js.configure({ appenders: { console: { type: 'console' }, file: { type: 'file', filename: 'logs/log4jsconnect.log' }, }, categories: { default: { appenders: ['console'], level: 'debug' }, log4jslog: { appenders: ['file'], level: 'debug' }, }, }); // define logger const logger = log4js.getLogger('log4jslog'); // set at which time msg is logged print like: only on error & above // logger.setLevel('ERROR'); // express app app.use(express.favicon('')); // app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO })); // app.use(log4js.connectLogger(logger, { level: 'auto', format: ':method :url :status' })); // ### AUTO LEVEL DETECTION // http responses 3xx, level = WARN // http responses 4xx & 5xx, level = ERROR // else.level = INFO app.use(log4js.connectLogger(logger, { level: 'auto' })); // route app.get('/', (req, res) => { res.send('hello world'); }); // start app app.listen(5000); console.log('server runing at localhost:5000'); console.log('Simulation of normal response: goto localhost:5000'); console.log('Simulation of error response: goto localhost:5000/xxx');
-1
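The AUTO LEVEL DETECTION comments in the connect-logger example above describe how `level: 'auto'` is meant to pick a log level from the HTTP status code. Purely as an illustration of that mapping (this is not the connect-logger's internal code):

```javascript
// Hand-rolled equivalent of the mapping described in the example's comments:
// 3xx -> WARN, 4xx & 5xx -> ERROR, everything else -> INFO.
function levelForStatus(statusCode) {
  if (statusCode >= 400) return 'ERROR';
  if (statusCode >= 300) return 'WARN';
  return 'INFO';
}

console.log(levelForStatus(200)); // INFO
console.log(levelForStatus(302)); // WARN
console.log(levelForStatus(404)); // ERROR
```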
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./lib/layouts.js
const dateFormat = require('date-format'); const os = require('os'); const util = require('util'); const path = require('path'); const url = require('url'); const debug = require('debug')('log4js:layouts'); const styles = { // styles bold: [1, 22], italic: [3, 23], underline: [4, 24], inverse: [7, 27], // grayscale white: [37, 39], grey: [90, 39], black: [90, 39], // colors blue: [34, 39], cyan: [36, 39], green: [32, 39], magenta: [35, 39], red: [91, 39], yellow: [33, 39], }; function colorizeStart(style) { return style ? `\x1B[${styles[style][0]}m` : ''; } function colorizeEnd(style) { return style ? `\x1B[${styles[style][1]}m` : ''; } /** * Taken from masylum's fork (https://github.com/masylum/log4js-node) */ function colorize(str, style) { return colorizeStart(style) + str + colorizeEnd(style); } function timestampLevelAndCategory(loggingEvent, colour) { return colorize( util.format( '[%s] [%s] %s - ', dateFormat.asString(loggingEvent.startTime), loggingEvent.level.toString(), loggingEvent.categoryName ), colour ); } /** * BasicLayout is a simple layout for storing the logs. The logs are stored * in following format: * <pre> * [startTime] [logLevel] categoryName - message\n * </pre> * * @author Stephan Strittmatter */ function basicLayout(loggingEvent) { return ( timestampLevelAndCategory(loggingEvent) + util.format(...loggingEvent.data) ); } /** * colouredLayout - taken from masylum's fork. * same as basicLayout, but with colours. */ function colouredLayout(loggingEvent) { return ( timestampLevelAndCategory(loggingEvent, loggingEvent.level.colour) + util.format(...loggingEvent.data) ); } function messagePassThroughLayout(loggingEvent) { return util.format(...loggingEvent.data); } function dummyLayout(loggingEvent) { return loggingEvent.data[0]; } /** * PatternLayout * Format for specifiers is %[padding].[truncation][field]{[format]} * e.g. %5.10p - left pad the log level by 5 characters, up to a max of 10 * both padding and truncation can be negative. * Negative truncation = trunc from end of string * Positive truncation = trunc from start of string * Negative padding = pad right * Positive padding = pad left * * Fields can be any of: * - %r time in toLocaleTimeString format * - %p log level * - %c log category * - %h hostname * - %m log data * - %d date in constious formats * - %% % * - %n newline * - %z pid * - %f filename * - %l line number * - %o column postion * - %s call stack * - %C class name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %M method or function name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %A method or function alias [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %F fully qualified caller name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %x{<tokenname>} add dynamic tokens to your log. Tokens are specified in the tokens parameter * - %X{<tokenname>} add dynamic tokens to your log. Tokens are specified in logger context * You can use %[ and %] to define a colored block. * * Tokens are specified as simple key:value objects. * The key represents the token name whereas the value can be a string or function * which is called to extract the value to put in the log message. If token is not * found, it doesn't replace the field. * * A sample token would be: { 'pid' : function() { return process.pid; } } * * Takes a pattern string, array of tokens and returns a layout function. 
* @return {Function} * @param pattern * @param tokens * @param timezoneOffset * * @authors ['Stephan Strittmatter', 'Jan Schmidle'] */ function patternLayout(pattern, tokens) { const TTCC_CONVERSION_PATTERN = '%r %p %c - %m%n'; const regex = /%(-?[0-9]+)?(\.?-?[0-9]+)?([[\]cdhmnprzxXyflosCMAF%])(\{([^}]+)\})?|([^%]+)/; pattern = pattern || TTCC_CONVERSION_PATTERN; function categoryName(loggingEvent, specifier) { let loggerName = loggingEvent.categoryName; if (specifier) { const precision = parseInt(specifier, 10); const loggerNameBits = loggerName.split('.'); if (precision < loggerNameBits.length) { loggerName = loggerNameBits .slice(loggerNameBits.length - precision) .join('.'); } } return loggerName; } function formatAsDate(loggingEvent, specifier) { let format = dateFormat.ISO8601_FORMAT; if (specifier) { format = specifier; // Pick up special cases switch (format) { case 'ISO8601': case 'ISO8601_FORMAT': format = dateFormat.ISO8601_FORMAT; break; case 'ISO8601_WITH_TZ_OFFSET': case 'ISO8601_WITH_TZ_OFFSET_FORMAT': format = dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT; break; case 'ABSOLUTE': process.emitWarning( 'Pattern %d{ABSOLUTE} is deprecated in favor of %d{ABSOLUTETIME}. ' + 'Please use %d{ABSOLUTETIME} instead.', 'DeprecationWarning', 'log4js-node-DEP0003' ); debug( '[log4js-node-DEP0003]', 'DEPRECATION: Pattern %d{ABSOLUTE} is deprecated and replaced by %d{ABSOLUTETIME}.' ); // falls through case 'ABSOLUTETIME': case 'ABSOLUTETIME_FORMAT': format = dateFormat.ABSOLUTETIME_FORMAT; break; case 'DATE': process.emitWarning( 'Pattern %d{DATE} is deprecated due to the confusion it causes when used. ' + 'Please use %d{DATETIME} instead.', 'DeprecationWarning', 'log4js-node-DEP0004' ); debug( '[log4js-node-DEP0004]', 'DEPRECATION: Pattern %d{DATE} is deprecated and replaced by %d{DATETIME}.' ); // falls through case 'DATETIME': case 'DATETIME_FORMAT': format = dateFormat.DATETIME_FORMAT; break; // no default } } // Format the date return dateFormat.asString(format, loggingEvent.startTime); } function hostname() { return os.hostname().toString(); } function formatMessage(loggingEvent) { return util.format(...loggingEvent.data); } function endOfLine() { return os.EOL; } function logLevel(loggingEvent) { return loggingEvent.level.toString(); } function startTime(loggingEvent) { return dateFormat.asString('hh:mm:ss', loggingEvent.startTime); } function startColour(loggingEvent) { return colorizeStart(loggingEvent.level.colour); } function endColour(loggingEvent) { return colorizeEnd(loggingEvent.level.colour); } function percent() { return '%'; } function pid(loggingEvent) { return loggingEvent && loggingEvent.pid ? loggingEvent.pid.toString() : process.pid.toString(); } function clusterInfo() { // this used to try to return the master and worker pids, // but it would never have worked because master pid is not available to workers // leaving this here to maintain compatibility for patterns return pid(); } function userDefined(loggingEvent, specifier) { if (typeof tokens[specifier] !== 'undefined') { return typeof tokens[specifier] === 'function' ? tokens[specifier](loggingEvent) : tokens[specifier]; } return null; } function contextDefined(loggingEvent, specifier) { const resolver = loggingEvent.context[specifier]; if (typeof resolver !== 'undefined') { return typeof resolver === 'function' ? 
resolver(loggingEvent) : resolver; } return null; } function fileName(loggingEvent, specifier) { let filename = loggingEvent.fileName || ''; // support for ESM as it uses url instead of path for file /* istanbul ignore next: unsure how to simulate ESM for test coverage */ const convertFileURLToPath = function (filepath) { const urlPrefix = 'file://'; if (filepath.startsWith(urlPrefix)) { // https://nodejs.org/api/url.html#urlfileurltopathurl if (typeof url.fileURLToPath === 'function') { filepath = url.fileURLToPath(filepath); } // backward-compatible for nodejs pre-10.12.0 (without url.fileURLToPath method) else { // posix: file:///hello/world/foo.txt -> /hello/world/foo.txt -> /hello/world/foo.txt // win32: file:///C:/path/foo.txt -> /C:/path/foo.txt -> \C:\path\foo.txt -> C:\path\foo.txt // win32: file://nas/foo.txt -> //nas/foo.txt -> nas\foo.txt -> \\nas\foo.txt filepath = path.normalize( filepath.replace(new RegExp(`^${urlPrefix}`), '') ); if (process.platform === 'win32') { if (filepath.startsWith('\\')) { filepath = filepath.slice(1); } else { filepath = path.sep + path.sep + filepath; } } } } return filepath; }; filename = convertFileURLToPath(filename); if (specifier) { const fileDepth = parseInt(specifier, 10); const fileList = filename.split(path.sep); if (fileList.length > fileDepth) { filename = fileList.slice(-fileDepth).join(path.sep); } } return filename; } function lineNumber(loggingEvent) { return loggingEvent.lineNumber ? `${loggingEvent.lineNumber}` : ''; } function columnNumber(loggingEvent) { return loggingEvent.columnNumber ? `${loggingEvent.columnNumber}` : ''; } function callStack(loggingEvent) { return loggingEvent.callStack || ''; } function className(loggingEvent) { return loggingEvent.className || ''; } function functionName(loggingEvent) { return loggingEvent.functionName || ''; } function functionAlias(loggingEvent) { return loggingEvent.functionAlias || ''; } function callerName(loggingEvent) { return loggingEvent.callerName || ''; } const replacers = { c: categoryName, d: formatAsDate, h: hostname, m: formatMessage, n: endOfLine, p: logLevel, r: startTime, '[': startColour, ']': endColour, y: clusterInfo, z: pid, '%': percent, x: userDefined, X: contextDefined, f: fileName, l: lineNumber, o: columnNumber, s: callStack, C: className, M: functionName, A: functionAlias, F: callerName, }; function replaceToken(conversionCharacter, loggingEvent, specifier) { return replacers[conversionCharacter](loggingEvent, specifier); } function truncate(truncation, toTruncate) { let len; if (truncation) { len = parseInt(truncation.slice(1), 10); // negative truncate length means truncate from end of string return len > 0 ? 
toTruncate.slice(0, len) : toTruncate.slice(len); } return toTruncate; } function pad(padding, toPad) { let len; if (padding) { if (padding.charAt(0) === '-') { len = parseInt(padding.slice(1), 10); // Right pad with spaces while (toPad.length < len) { toPad += ' '; } } else { len = parseInt(padding, 10); // Left pad with spaces while (toPad.length < len) { toPad = ` ${toPad}`; } } } return toPad; } function truncateAndPad(toTruncAndPad, truncation, padding) { let replacement = toTruncAndPad; replacement = truncate(truncation, replacement); replacement = pad(padding, replacement); return replacement; } return function (loggingEvent) { let formattedString = ''; let result; let searchString = pattern; while ((result = regex.exec(searchString)) !== null) { // const matchedString = result[0]; const padding = result[1]; const truncation = result[2]; const conversionCharacter = result[3]; const specifier = result[5]; const text = result[6]; // Check if the pattern matched was just normal text if (text) { formattedString += text.toString(); } else { // Create a raw replacement string based on the conversion // character and specifier const replacement = replaceToken( conversionCharacter, loggingEvent, specifier ); formattedString += truncateAndPad(replacement, truncation, padding); } searchString = searchString.slice(result.index + result[0].length); } return formattedString; }; } const layoutMakers = { messagePassThrough() { return messagePassThroughLayout; }, basic() { return basicLayout; }, colored() { return colouredLayout; }, coloured() { return colouredLayout; }, pattern(config) { return patternLayout(config && config.pattern, config && config.tokens); }, dummy() { return dummyLayout; }, }; module.exports = { basicLayout, messagePassThroughLayout, patternLayout, colouredLayout, coloredLayout: colouredLayout, dummyLayout, addLayout(name, serializerGenerator) { layoutMakers[name] = serializerGenerator; }, layout(name, config) { return layoutMakers[name] && layoutMakers[name](config); }, };
const dateFormat = require('date-format'); const os = require('os'); const util = require('util'); const path = require('path'); const url = require('url'); const debug = require('debug')('log4js:layouts'); const styles = { // styles bold: [1, 22], italic: [3, 23], underline: [4, 24], inverse: [7, 27], // grayscale white: [37, 39], grey: [90, 39], black: [90, 39], // colors blue: [34, 39], cyan: [36, 39], green: [32, 39], magenta: [35, 39], red: [91, 39], yellow: [33, 39], }; function colorizeStart(style) { return style ? `\x1B[${styles[style][0]}m` : ''; } function colorizeEnd(style) { return style ? `\x1B[${styles[style][1]}m` : ''; } /** * Taken from masylum's fork (https://github.com/masylum/log4js-node) */ function colorize(str, style) { return colorizeStart(style) + str + colorizeEnd(style); } function timestampLevelAndCategory(loggingEvent, colour) { return colorize( util.format( '[%s] [%s] %s - ', dateFormat.asString(loggingEvent.startTime), loggingEvent.level.toString(), loggingEvent.categoryName ), colour ); } /** * BasicLayout is a simple layout for storing the logs. The logs are stored * in following format: * <pre> * [startTime] [logLevel] categoryName - message\n * </pre> * * @author Stephan Strittmatter */ function basicLayout(loggingEvent) { return ( timestampLevelAndCategory(loggingEvent) + util.format(...loggingEvent.data) ); } /** * colouredLayout - taken from masylum's fork. * same as basicLayout, but with colours. */ function colouredLayout(loggingEvent) { return ( timestampLevelAndCategory(loggingEvent, loggingEvent.level.colour) + util.format(...loggingEvent.data) ); } function messagePassThroughLayout(loggingEvent) { return util.format(...loggingEvent.data); } function dummyLayout(loggingEvent) { return loggingEvent.data[0]; } /** * PatternLayout * Format for specifiers is %[padding].[truncation][field]{[format]} * e.g. %5.10p - left pad the log level by 5 characters, up to a max of 10 * both padding and truncation can be negative. * Negative truncation = trunc from end of string * Positive truncation = trunc from start of string * Negative padding = pad right * Positive padding = pad left * * Fields can be any of: * - %r time in toLocaleTimeString format * - %p log level * - %c log category * - %h hostname * - %m log data * - %d date in constious formats * - %% % * - %n newline * - %z pid * - %f filename * - %l line number * - %o column postion * - %s call stack * - %C class name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %M method or function name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %A method or function alias [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %F fully qualified caller name [#1316](https://github.com/log4js-node/log4js-node/pull/1316) * - %x{<tokenname>} add dynamic tokens to your log. Tokens are specified in the tokens parameter * - %X{<tokenname>} add dynamic tokens to your log. Tokens are specified in logger context * You can use %[ and %] to define a colored block. * * Tokens are specified as simple key:value objects. * The key represents the token name whereas the value can be a string or function * which is called to extract the value to put in the log message. If token is not * found, it doesn't replace the field. * * A sample token would be: { 'pid' : function() { return process.pid; } } * * Takes a pattern string, array of tokens and returns a layout function. 
* @return {Function} * @param pattern * @param tokens * @param timezoneOffset * * @authors ['Stephan Strittmatter', 'Jan Schmidle'] */ function patternLayout(pattern, tokens) { const TTCC_CONVERSION_PATTERN = '%r %p %c - %m%n'; const regex = /%(-?[0-9]+)?(\.?-?[0-9]+)?([[\]cdhmnprzxXyflosCMAF%])(\{([^}]+)\})?|([^%]+)/; pattern = pattern || TTCC_CONVERSION_PATTERN; function categoryName(loggingEvent, specifier) { let loggerName = loggingEvent.categoryName; if (specifier) { const precision = parseInt(specifier, 10); const loggerNameBits = loggerName.split('.'); if (precision < loggerNameBits.length) { loggerName = loggerNameBits .slice(loggerNameBits.length - precision) .join('.'); } } return loggerName; } function formatAsDate(loggingEvent, specifier) { let format = dateFormat.ISO8601_FORMAT; if (specifier) { format = specifier; // Pick up special cases switch (format) { case 'ISO8601': case 'ISO8601_FORMAT': format = dateFormat.ISO8601_FORMAT; break; case 'ISO8601_WITH_TZ_OFFSET': case 'ISO8601_WITH_TZ_OFFSET_FORMAT': format = dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT; break; case 'ABSOLUTE': process.emitWarning( 'Pattern %d{ABSOLUTE} is deprecated in favor of %d{ABSOLUTETIME}. ' + 'Please use %d{ABSOLUTETIME} instead.', 'DeprecationWarning', 'log4js-node-DEP0003' ); debug( '[log4js-node-DEP0003]', 'DEPRECATION: Pattern %d{ABSOLUTE} is deprecated and replaced by %d{ABSOLUTETIME}.' ); // falls through case 'ABSOLUTETIME': case 'ABSOLUTETIME_FORMAT': format = dateFormat.ABSOLUTETIME_FORMAT; break; case 'DATE': process.emitWarning( 'Pattern %d{DATE} is deprecated due to the confusion it causes when used. ' + 'Please use %d{DATETIME} instead.', 'DeprecationWarning', 'log4js-node-DEP0004' ); debug( '[log4js-node-DEP0004]', 'DEPRECATION: Pattern %d{DATE} is deprecated and replaced by %d{DATETIME}.' ); // falls through case 'DATETIME': case 'DATETIME_FORMAT': format = dateFormat.DATETIME_FORMAT; break; // no default } } // Format the date return dateFormat.asString(format, loggingEvent.startTime); } function hostname() { return os.hostname().toString(); } function formatMessage(loggingEvent) { return util.format(...loggingEvent.data); } function endOfLine() { return os.EOL; } function logLevel(loggingEvent) { return loggingEvent.level.toString(); } function startTime(loggingEvent) { return dateFormat.asString('hh:mm:ss', loggingEvent.startTime); } function startColour(loggingEvent) { return colorizeStart(loggingEvent.level.colour); } function endColour(loggingEvent) { return colorizeEnd(loggingEvent.level.colour); } function percent() { return '%'; } function pid(loggingEvent) { return loggingEvent && loggingEvent.pid ? loggingEvent.pid.toString() : process.pid.toString(); } function clusterInfo() { // this used to try to return the master and worker pids, // but it would never have worked because master pid is not available to workers // leaving this here to maintain compatibility for patterns return pid(); } function userDefined(loggingEvent, specifier) { if (typeof tokens[specifier] !== 'undefined') { return typeof tokens[specifier] === 'function' ? tokens[specifier](loggingEvent) : tokens[specifier]; } return null; } function contextDefined(loggingEvent, specifier) { const resolver = loggingEvent.context[specifier]; if (typeof resolver !== 'undefined') { return typeof resolver === 'function' ? 
resolver(loggingEvent) : resolver; } return null; } function fileName(loggingEvent, specifier) { let filename = loggingEvent.fileName || ''; // support for ESM as it uses url instead of path for file /* istanbul ignore next: unsure how to simulate ESM for test coverage */ const convertFileURLToPath = function (filepath) { const urlPrefix = 'file://'; if (filepath.startsWith(urlPrefix)) { // https://nodejs.org/api/url.html#urlfileurltopathurl if (typeof url.fileURLToPath === 'function') { filepath = url.fileURLToPath(filepath); } // backward-compatible for nodejs pre-10.12.0 (without url.fileURLToPath method) else { // posix: file:///hello/world/foo.txt -> /hello/world/foo.txt -> /hello/world/foo.txt // win32: file:///C:/path/foo.txt -> /C:/path/foo.txt -> \C:\path\foo.txt -> C:\path\foo.txt // win32: file://nas/foo.txt -> //nas/foo.txt -> nas\foo.txt -> \\nas\foo.txt filepath = path.normalize( filepath.replace(new RegExp(`^${urlPrefix}`), '') ); if (process.platform === 'win32') { if (filepath.startsWith('\\')) { filepath = filepath.slice(1); } else { filepath = path.sep + path.sep + filepath; } } } } return filepath; }; filename = convertFileURLToPath(filename); if (specifier) { const fileDepth = parseInt(specifier, 10); const fileList = filename.split(path.sep); if (fileList.length > fileDepth) { filename = fileList.slice(-fileDepth).join(path.sep); } } return filename; } function lineNumber(loggingEvent) { return loggingEvent.lineNumber ? `${loggingEvent.lineNumber}` : ''; } function columnNumber(loggingEvent) { return loggingEvent.columnNumber ? `${loggingEvent.columnNumber}` : ''; } function callStack(loggingEvent) { return loggingEvent.callStack || ''; } function className(loggingEvent) { return loggingEvent.className || ''; } function functionName(loggingEvent) { return loggingEvent.functionName || ''; } function functionAlias(loggingEvent) { return loggingEvent.functionAlias || ''; } function callerName(loggingEvent) { return loggingEvent.callerName || ''; } const replacers = { c: categoryName, d: formatAsDate, h: hostname, m: formatMessage, n: endOfLine, p: logLevel, r: startTime, '[': startColour, ']': endColour, y: clusterInfo, z: pid, '%': percent, x: userDefined, X: contextDefined, f: fileName, l: lineNumber, o: columnNumber, s: callStack, C: className, M: functionName, A: functionAlias, F: callerName, }; function replaceToken(conversionCharacter, loggingEvent, specifier) { return replacers[conversionCharacter](loggingEvent, specifier); } function truncate(truncation, toTruncate) { let len; if (truncation) { len = parseInt(truncation.slice(1), 10); // negative truncate length means truncate from end of string return len > 0 ? 
toTruncate.slice(0, len) : toTruncate.slice(len); } return toTruncate; } function pad(padding, toPad) { let len; if (padding) { if (padding.charAt(0) === '-') { len = parseInt(padding.slice(1), 10); // Right pad with spaces while (toPad.length < len) { toPad += ' '; } } else { len = parseInt(padding, 10); // Left pad with spaces while (toPad.length < len) { toPad = ` ${toPad}`; } } } return toPad; } function truncateAndPad(toTruncAndPad, truncation, padding) { let replacement = toTruncAndPad; replacement = truncate(truncation, replacement); replacement = pad(padding, replacement); return replacement; } return function (loggingEvent) { let formattedString = ''; let result; let searchString = pattern; while ((result = regex.exec(searchString)) !== null) { // const matchedString = result[0]; const padding = result[1]; const truncation = result[2]; const conversionCharacter = result[3]; const specifier = result[5]; const text = result[6]; // Check if the pattern matched was just normal text if (text) { formattedString += text.toString(); } else { // Create a raw replacement string based on the conversion // character and specifier const replacement = replaceToken( conversionCharacter, loggingEvent, specifier ); formattedString += truncateAndPad(replacement, truncation, padding); } searchString = searchString.slice(result.index + result[0].length); } return formattedString; }; } const layoutMakers = { messagePassThrough() { return messagePassThroughLayout; }, basic() { return basicLayout; }, colored() { return colouredLayout; }, coloured() { return colouredLayout; }, pattern(config) { return patternLayout(config && config.pattern, config && config.tokens); }, dummy() { return dummyLayout; }, }; module.exports = { basicLayout, messagePassThroughLayout, patternLayout, colouredLayout, coloredLayout: colouredLayout, dummyLayout, addLayout(name, serializerGenerator) { layoutMakers[name] = serializerGenerator; }, layout(name, config) { return layoutMakers[name] && layoutMakers[name](config); }, };
-1
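The `patternLayout` implementation above accepts a pattern string plus an optional tokens map, with padding/truncation specifiers such as `%-5p` and custom `%x{}` tokens. A small usage example follows; the pattern string and the `user` token are illustrative choices, not anything the library requires:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    out: {
      type: 'stdout',
      layout: {
        type: 'pattern',
        // %d date, %-5p level right-padded to 5 chars, %c category,
        // %x{user} pulls from the tokens map below, %m the log data
        pattern: '%d{ISO8601} %-5p %c %x{user} - %m',
        tokens: {
          user: () => process.env.USER || 'unknown',
        },
      },
    },
  },
  categories: { default: { appenders: ['out'], level: 'info' } },
});

log4js.getLogger('cheese').info('pattern layout in action');
```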
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/fileAppender-test.js
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const fs = require('fs-extra'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const zlib = require('zlib'); const util = require('util'); const osDelay = process.platform === 'win32' ? 400 : 200; const sleep = util.promisify(setTimeout); const gunzip = util.promisify(zlib.gunzip); const EOL = require('os').EOL || '\n'; const log4js = require('../../lib/log4js'); const removeFile = async (filename) => { try { await fs.unlink(filename); } catch (e) { // let's pretend this never happened } }; test('log4js fileAppender', (batch) => { batch.test('with default fileAppender settings', async (t) => { const testFile = path.join(__dirname, 'fa-default-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); log4js.configure({ appenders: { file: { type: 'file', filename: testFile } }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This should be in the file.'); await sleep(osDelay); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, `This should be in the file.${EOL}`); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.test('should give error if invalid filename', async (t) => { const file = ''; t.throws( () => log4js.configure({ appenders: { file: { type: 'file', filename: file, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Invalid filename: ${file}`) ); const dir = `.${path.sep}`; t.throws( () => log4js.configure({ appenders: { file: { type: 'file', filename: dir, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Filename is a directory: ${dir}`) ); t.end(); }); batch.test('should flush logs on shutdown', async (t) => { const testFile = path.join(__dirname, 'fa-default-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); log4js.configure({ appenders: { test: { type: 'file', filename: testFile } }, categories: { default: { appenders: ['test'], level: 'trace' } }, }); logger.info('1'); logger.info('2'); logger.info('3'); await new Promise((resolve) => { log4js.shutdown(resolve); }); const fileContents = await fs.readFile(testFile, 'utf8'); // 3 lines of output, plus the trailing newline. 
t.equal(fileContents.split(EOL).length, 4); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.test('with a max file size and no backups', async (t) => { const testFile = path.join(__dirname, 'fa-maxFileSize-test.log'); const logger = log4js.getLogger('max-file-size'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); // log file of 100 bytes maximum, no backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 100, backups: 0, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); logger.info('This is the first log message.'); logger.info('This is an intermediate log message.'); logger.info('This is the second log message.'); // wait for the file system to catch up await sleep(osDelay * 2); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, 'This is the second log message.'); t.equal(fileContents.indexOf('This is the first log message.'), -1); const files = await fs.readdir(__dirname); const logFiles = files.filter((file) => file.includes('fa-maxFileSize-test.log') ); t.equal(logFiles.length, 1, 'should be 1 file'); t.end(); }); batch.test('with a max file size in wrong unit mode', async (t) => { const invalidUnit = '1Z'; const expectedError = new Error(`maxLogSize: "${invalidUnit}" is invalid`); t.throws( () => log4js.configure({ appenders: { file: { type: 'file', maxLogSize: invalidUnit, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), expectedError ); t.end(); }); batch.test('with a max file size in unit mode and no backups', async (t) => { const testFile = path.join(__dirname, 'fa-maxFileSize-unit-test.log'); const logger = log4js.getLogger('max-file-size-unit'); await Promise.all([removeFile(testFile), removeFile(`${testFile}.1`)]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([removeFile(testFile), removeFile(`${testFile}.1`)]); }); // log file of 1K = 1024 bytes maximum, no backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: '1K', backups: 0, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); const maxLine = 22; // 1024 max file size / 47 bytes per line for (let i = 0; i < maxLine; i++) { logger.info('These are the log messages for the first file.'); // 46 bytes per line + '\n' } logger.info('This is the second log message.'); // wait for the file system to catch up await sleep(osDelay); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, 'This is the second log message.'); t.notMatch(fileContents, 'These are the log messages for the first file.'); const files = await fs.readdir(__dirname); const logFiles = files.filter((file) => file.includes('fa-maxFileSize-unit-test.log') ); t.equal(logFiles.length, 1, 'should be 1 file'); t.end(); }); batch.test('with a max file size and 2 backups', async (t) => { const testFile = path.join( __dirname, 'fa-maxFileSize-with-backups-test.log' ); const logger = log4js.getLogger('max-file-size-backups'); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1`), removeFile(`${testFile}.2`), ]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1`), 
removeFile(`${testFile}.2`), ]); }); // log file of 50 bytes maximum, 2 backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 50, backups: 2, }, }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is the second log message.'); logger.info('This is the third log message.'); logger.info('This is the fourth log message.'); // give the system a chance to open the stream await sleep(osDelay); const files = await fs.readdir(__dirname); const logFiles = files .sort() .filter((file) => file.includes('fa-maxFileSize-with-backups-test.log')); t.equal(logFiles.length, 3); t.same(logFiles, [ 'fa-maxFileSize-with-backups-test.log', 'fa-maxFileSize-with-backups-test.log.1', 'fa-maxFileSize-with-backups-test.log.2', ]); let contents = await fs.readFile(path.join(__dirname, logFiles[0]), 'utf8'); t.match(contents, 'This is the fourth log message.'); contents = await fs.readFile(path.join(__dirname, logFiles[1]), 'utf8'); t.match(contents, 'This is the third log message.'); contents = await fs.readFile(path.join(__dirname, logFiles[2]), 'utf8'); t.match(contents, 'This is the second log message.'); t.end(); }); batch.test('with a max file size and 2 compressed backups', async (t) => { const testFile = path.join( __dirname, 'fa-maxFileSize-with-backups-compressed-test.log' ); const logger = log4js.getLogger('max-file-size-backups'); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1.gz`), removeFile(`${testFile}.2.gz`), ]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1.gz`), removeFile(`${testFile}.2.gz`), ]); }); // log file of 50 bytes maximum, 2 backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 50, backups: 2, compress: true, }, }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is the second log message.'); logger.info('This is the third log message.'); logger.info('This is the fourth log message.'); // give the system a chance to open the stream await sleep(osDelay); const files = await fs.readdir(__dirname); const logFiles = files .sort() .filter((file) => file.includes('fa-maxFileSize-with-backups-compressed-test.log') ); t.equal(logFiles.length, 3, 'should be 3 files'); t.same(logFiles, [ 'fa-maxFileSize-with-backups-compressed-test.log', 'fa-maxFileSize-with-backups-compressed-test.log.1.gz', 'fa-maxFileSize-with-backups-compressed-test.log.2.gz', ]); let contents = await fs.readFile(path.join(__dirname, logFiles[0]), 'utf8'); t.match(contents, 'This is the fourth log message.'); contents = await gunzip( await fs.readFile(path.join(__dirname, logFiles[1])) ); t.match(contents.toString('utf8'), 'This is the third log message.'); contents = await gunzip( await fs.readFile(path.join(__dirname, logFiles[2])) ); t.match(contents.toString('utf8'), 'This is the second log message.'); t.end(); }); batch.test('handling of writer.writable', (t) => { const output = []; let writable = true; const RollingFileStream = class { write(loggingEvent) { output.push(loggingEvent); this.written = true; return true; } // eslint-disable-next-line class-methods-use-this on() {} // eslint-disable-next-line class-methods-use-this get writable() { return writable; } }; const fileAppender = sandbox.require('../../lib/appenders/file', { requires: { 
streamroller: { RollingFileStream, }, }, }); const appender = fileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout(loggingEvent) { return loggingEvent.data; }, } ); t.test('should log when writer.writable=true', (assert) => { writable = true; appender({ data: 'something to log' }); assert.ok(output.length, 1); assert.match(output[output.length - 1], 'something to log'); assert.end(); }); t.test('should not log when writer.writable=false', (assert) => { writable = false; appender({ data: 'this should not be logged' }); assert.ok(output.length, 1); assert.notMatch(output[output.length - 1], 'this should not be logged'); assert.end(); }); t.end(); }); batch.test('when underlying stream errors', (t) => { let consoleArgs; let errorHandler; const RollingFileStream = class { end() { this.ended = true; } on(evt, cb) { if (evt === 'error') { this.errored = true; errorHandler = cb; } } write() { this.written = true; return true; } }; const fileAppender = sandbox.require('../../lib/appenders/file', { globals: { console: { error(...args) { consoleArgs = args; }, }, }, requires: { streamroller: { RollingFileStream, }, }, }); fileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout() {} } ); errorHandler({ error: 'aargh' }); t.test('should log the error to console.error', (assert) => { assert.ok(consoleArgs); assert.equal( consoleArgs[0], 'log4js.fileAppender - Writing to file %s, error happened ' ); assert.equal(consoleArgs[1], 'test1.log'); assert.equal(consoleArgs[2].error, 'aargh'); assert.end(); }); t.end(); }); batch.test('with removeColor fileAppender settings', async (t) => { const testFilePlain = path.join(__dirname, 'fa-removeColor-test.log'); const testFileAsIs = path.join(__dirname, 'fa-asIs-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFilePlain); await removeFile(testFileAsIs); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFilePlain); await removeFile(testFileAsIs); }); log4js.configure({ appenders: { plainFile: { type: 'file', filename: testFilePlain, removeColor: true }, asIsFile: { type: 'file', filename: testFileAsIs, removeColor: false }, }, categories: { default: { appenders: ['plainFile', 'asIsFile'], level: 'debug' }, }, }); logger.info( 'This should be in the file.', '\x1b[33mColor\x1b[0m \x1b[93;41mshould\x1b[0m be \x1b[38;5;8mplain\x1b[0m.', {}, [] ); await sleep(osDelay); let fileContents = await fs.readFile(testFilePlain, 'utf8'); t.match( fileContents, `This should be in the file. Color should be plain. {} []${EOL}` ); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); fileContents = await fs.readFile(testFileAsIs, 'utf8'); t.match( fileContents, 'This should be in the file.', `\x1b[33mColor\x1b[0m \x1b[93;41mshould\x1b[0m be \x1b[38;5;8mplain\x1b[0m. {} []${EOL}` ); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.end(); });
/* eslint max-classes-per-file: ["error", 2] */ const { test } = require('tap'); const fs = require('fs-extra'); const path = require('path'); const sandbox = require('@log4js-node/sandboxed-module'); const zlib = require('zlib'); const util = require('util'); const osDelay = process.platform === 'win32' ? 400 : 200; const sleep = util.promisify(setTimeout); const gunzip = util.promisify(zlib.gunzip); const EOL = require('os').EOL || '\n'; const log4js = require('../../lib/log4js'); const removeFile = async (filename) => { try { await fs.unlink(filename); } catch (e) { // let's pretend this never happened } }; test('log4js fileAppender', (batch) => { batch.test('with default fileAppender settings', async (t) => { const testFile = path.join(__dirname, 'fa-default-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); log4js.configure({ appenders: { file: { type: 'file', filename: testFile } }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This should be in the file.'); await sleep(osDelay); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, `This should be in the file.${EOL}`); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.test('should give error if invalid filename', async (t) => { const file = ''; t.throws( () => log4js.configure({ appenders: { file: { type: 'file', filename: file, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Invalid filename: ${file}`) ); const dir = `.${path.sep}`; t.throws( () => log4js.configure({ appenders: { file: { type: 'file', filename: dir, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), new Error(`Filename is a directory: ${dir}`) ); t.end(); }); batch.test('should flush logs on shutdown', async (t) => { const testFile = path.join(__dirname, 'fa-default-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); log4js.configure({ appenders: { test: { type: 'file', filename: testFile } }, categories: { default: { appenders: ['test'], level: 'trace' } }, }); logger.info('1'); logger.info('2'); logger.info('3'); await new Promise((resolve) => { log4js.shutdown(resolve); }); const fileContents = await fs.readFile(testFile, 'utf8'); // 3 lines of output, plus the trailing newline. 
t.equal(fileContents.split(EOL).length, 4); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.test('with a max file size and no backups', async (t) => { const testFile = path.join(__dirname, 'fa-maxFileSize-test.log'); const logger = log4js.getLogger('max-file-size'); await removeFile(testFile); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFile); }); // log file of 100 bytes maximum, no backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 100, backups: 0, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); logger.info('This is the first log message.'); logger.info('This is an intermediate log message.'); logger.info('This is the second log message.'); // wait for the file system to catch up await sleep(osDelay * 2); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, 'This is the second log message.'); t.equal(fileContents.indexOf('This is the first log message.'), -1); const files = await fs.readdir(__dirname); const logFiles = files.filter((file) => file.includes('fa-maxFileSize-test.log') ); t.equal(logFiles.length, 1, 'should be 1 file'); t.end(); }); batch.test('with a max file size in wrong unit mode', async (t) => { const invalidUnit = '1Z'; const expectedError = new Error(`maxLogSize: "${invalidUnit}" is invalid`); t.throws( () => log4js.configure({ appenders: { file: { type: 'file', maxLogSize: invalidUnit, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }), expectedError ); t.end(); }); batch.test('with a max file size in unit mode and no backups', async (t) => { const testFile = path.join(__dirname, 'fa-maxFileSize-unit-test.log'); const logger = log4js.getLogger('max-file-size-unit'); await Promise.all([removeFile(testFile), removeFile(`${testFile}.1`)]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([removeFile(testFile), removeFile(`${testFile}.1`)]); }); // log file of 1K = 1024 bytes maximum, no backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: '1K', backups: 0, layout: { type: 'messagePassThrough' }, }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); const maxLine = 22; // 1024 max file size / 47 bytes per line for (let i = 0; i < maxLine; i++) { logger.info('These are the log messages for the first file.'); // 46 bytes per line + '\n' } logger.info('This is the second log message.'); // wait for the file system to catch up await sleep(osDelay); const fileContents = await fs.readFile(testFile, 'utf8'); t.match(fileContents, 'This is the second log message.'); t.notMatch(fileContents, 'These are the log messages for the first file.'); const files = await fs.readdir(__dirname); const logFiles = files.filter((file) => file.includes('fa-maxFileSize-unit-test.log') ); t.equal(logFiles.length, 1, 'should be 1 file'); t.end(); }); batch.test('with a max file size and 2 backups', async (t) => { const testFile = path.join( __dirname, 'fa-maxFileSize-with-backups-test.log' ); const logger = log4js.getLogger('max-file-size-backups'); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1`), removeFile(`${testFile}.2`), ]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1`), 
removeFile(`${testFile}.2`), ]); }); // log file of 50 bytes maximum, 2 backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 50, backups: 2, }, }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is the second log message.'); logger.info('This is the third log message.'); logger.info('This is the fourth log message.'); // give the system a chance to open the stream await sleep(osDelay); const files = await fs.readdir(__dirname); const logFiles = files .sort() .filter((file) => file.includes('fa-maxFileSize-with-backups-test.log')); t.equal(logFiles.length, 3); t.same(logFiles, [ 'fa-maxFileSize-with-backups-test.log', 'fa-maxFileSize-with-backups-test.log.1', 'fa-maxFileSize-with-backups-test.log.2', ]); let contents = await fs.readFile(path.join(__dirname, logFiles[0]), 'utf8'); t.match(contents, 'This is the fourth log message.'); contents = await fs.readFile(path.join(__dirname, logFiles[1]), 'utf8'); t.match(contents, 'This is the third log message.'); contents = await fs.readFile(path.join(__dirname, logFiles[2]), 'utf8'); t.match(contents, 'This is the second log message.'); t.end(); }); batch.test('with a max file size and 2 compressed backups', async (t) => { const testFile = path.join( __dirname, 'fa-maxFileSize-with-backups-compressed-test.log' ); const logger = log4js.getLogger('max-file-size-backups'); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1.gz`), removeFile(`${testFile}.2.gz`), ]); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await Promise.all([ removeFile(testFile), removeFile(`${testFile}.1.gz`), removeFile(`${testFile}.2.gz`), ]); }); // log file of 50 bytes maximum, 2 backups log4js.configure({ appenders: { file: { type: 'file', filename: testFile, maxLogSize: 50, backups: 2, compress: true, }, }, categories: { default: { appenders: ['file'], level: 'debug' } }, }); logger.info('This is the first log message.'); logger.info('This is the second log message.'); logger.info('This is the third log message.'); logger.info('This is the fourth log message.'); // give the system a chance to open the stream await sleep(osDelay); const files = await fs.readdir(__dirname); const logFiles = files .sort() .filter((file) => file.includes('fa-maxFileSize-with-backups-compressed-test.log') ); t.equal(logFiles.length, 3, 'should be 3 files'); t.same(logFiles, [ 'fa-maxFileSize-with-backups-compressed-test.log', 'fa-maxFileSize-with-backups-compressed-test.log.1.gz', 'fa-maxFileSize-with-backups-compressed-test.log.2.gz', ]); let contents = await fs.readFile(path.join(__dirname, logFiles[0]), 'utf8'); t.match(contents, 'This is the fourth log message.'); contents = await gunzip( await fs.readFile(path.join(__dirname, logFiles[1])) ); t.match(contents.toString('utf8'), 'This is the third log message.'); contents = await gunzip( await fs.readFile(path.join(__dirname, logFiles[2])) ); t.match(contents.toString('utf8'), 'This is the second log message.'); t.end(); }); batch.test('handling of writer.writable', (t) => { const output = []; let writable = true; const RollingFileStream = class { write(loggingEvent) { output.push(loggingEvent); this.written = true; return true; } // eslint-disable-next-line class-methods-use-this on() {} // eslint-disable-next-line class-methods-use-this get writable() { return writable; } }; const fileAppender = sandbox.require('../../lib/appenders/file', { requires: { 
streamroller: { RollingFileStream, }, }, }); const appender = fileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout(loggingEvent) { return loggingEvent.data; }, } ); t.test('should log when writer.writable=true', (assert) => { writable = true; appender({ data: 'something to log' }); assert.ok(output.length, 1); assert.match(output[output.length - 1], 'something to log'); assert.end(); }); t.test('should not log when writer.writable=false', (assert) => { writable = false; appender({ data: 'this should not be logged' }); assert.ok(output.length, 1); assert.notMatch(output[output.length - 1], 'this should not be logged'); assert.end(); }); t.end(); }); batch.test('when underlying stream errors', (t) => { let consoleArgs; let errorHandler; const RollingFileStream = class { end() { this.ended = true; } on(evt, cb) { if (evt === 'error') { this.errored = true; errorHandler = cb; } } write() { this.written = true; return true; } }; const fileAppender = sandbox.require('../../lib/appenders/file', { globals: { console: { error(...args) { consoleArgs = args; }, }, }, requires: { streamroller: { RollingFileStream, }, }, }); fileAppender.configure( { filename: 'test1.log', maxLogSize: 100 }, { basicLayout() {} } ); errorHandler({ error: 'aargh' }); t.test('should log the error to console.error', (assert) => { assert.ok(consoleArgs); assert.equal( consoleArgs[0], 'log4js.fileAppender - Writing to file %s, error happened ' ); assert.equal(consoleArgs[1], 'test1.log'); assert.equal(consoleArgs[2].error, 'aargh'); assert.end(); }); t.end(); }); batch.test('with removeColor fileAppender settings', async (t) => { const testFilePlain = path.join(__dirname, 'fa-removeColor-test.log'); const testFileAsIs = path.join(__dirname, 'fa-asIs-test.log'); const logger = log4js.getLogger('default-settings'); await removeFile(testFilePlain); await removeFile(testFileAsIs); t.teardown(async () => { await new Promise((resolve) => { log4js.shutdown(resolve); }); await removeFile(testFilePlain); await removeFile(testFileAsIs); }); log4js.configure({ appenders: { plainFile: { type: 'file', filename: testFilePlain, removeColor: true }, asIsFile: { type: 'file', filename: testFileAsIs, removeColor: false }, }, categories: { default: { appenders: ['plainFile', 'asIsFile'], level: 'debug' }, }, }); logger.info( 'This should be in the file.', '\x1b[33mColor\x1b[0m \x1b[93;41mshould\x1b[0m be \x1b[38;5;8mplain\x1b[0m.', {}, [] ); await sleep(osDelay); let fileContents = await fs.readFile(testFilePlain, 'utf8'); t.match( fileContents, `This should be in the file. Color should be plain. {} []${EOL}` ); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); fileContents = await fs.readFile(testFileAsIs, 'utf8'); t.match( fileContents, 'This should be in the file.', `\x1b[33mColor\x1b[0m \x1b[93;41mshould\x1b[0m be \x1b[38;5;8mplain\x1b[0m. {} []${EOL}` ); t.match( fileContents, /\[\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}] \[INFO] default-settings - / ); t.end(); }); batch.end(); });
-1
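The tests above exercise the file appender's `maxLogSize` (plain bytes or unit strings such as `'1K'`), `backups`, and `compress` options. A minimal configuration covering the same options is sketched below; the filename and sizes are illustrative:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    app: {
      type: 'file',
      filename: 'logs/app.log',
      maxLogSize: '10M', // plain byte counts such as 10485760 also work
      backups: 3, // keep at most 3 rolled backups
      compress: true, // rolled backups are gzipped (app.log.1.gz, ...)
    },
  },
  categories: { default: { appenders: ['app'], level: 'debug' } },
});

const logger = log4js.getLogger();
logger.info('rolling file appender configured');

// flush buffered output and close the stream before the process exits
log4js.shutdown();
```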
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./test/tap/tcp-appender-test.js
const { test } = require('tap'); const net = require('net'); const flatted = require('flatted'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const LoggingEvent = require('../../lib/LoggingEvent'); let messages = []; let server = null; function makeServer(config) { server = net.createServer((socket) => { socket.setEncoding('utf8'); socket.on('data', (data) => { data .split(config.endMsg) .filter((s) => s.length) .forEach((s) => { messages.push(config.deserialise(s)); }); }); }); server.unref(); return server; } function makeFakeNet() { return { data: [], cbs: {}, createConnectionCalled: 0, createConnection(port, host) { const fakeNet = this; this.port = port; this.host = host; this.createConnectionCalled += 1; return { on(evt, cb) { fakeNet.cbs[evt] = cb; }, write(data, encoding) { fakeNet.data.push(data); fakeNet.encoding = encoding; return false; }, end() { fakeNet.closeCalled = true; }, }; }, createServer(cb) { const fakeNet = this; cb({ remoteAddress: '1.2.3.4', remotePort: '1234', setEncoding(encoding) { fakeNet.encoding = encoding; }, on(event, cb2) { fakeNet.cbs[event] = cb2; }, }); return { listen(port, host) { fakeNet.port = port; fakeNet.host = host; }, }; }, }; } test('TCP Appender', (batch) => { batch.test('Default Configuration', (t) => { messages = []; const serverConfig = { endMsg: '__LOG4JS__', deserialise: (log) => LoggingEvent.deserialise(log), }; server = makeServer(serverConfig); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { default: { type: 'tcp', port }, }, categories: { default: { appenders: ['default'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent via TCP.'); logger.info('This should also be sent via TCP and not break things.'); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { data: ['This should be sent via TCP.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.match(messages[1], { data: ['This should also be sent via TCP and not break things.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.end(); }); }); }); }); batch.test('Custom EndMessage String', (t) => { messages = []; const serverConfig = { endMsg: '\n', deserialise: (log) => LoggingEvent.deserialise(log), }; server = makeServer(serverConfig); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { customEndMsg: { type: 'tcp', port, endMsg: '\n' }, }, categories: { default: { appenders: ['customEndMsg'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent via TCP using a custom EndMsg string.'); logger.info( 'This should also be sent via TCP using a custom EndMsg string and not break things.' 
); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { data: ['This should be sent via TCP using a custom EndMsg string.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.match(messages[1], { data: [ 'This should also be sent via TCP using a custom EndMsg string and not break things.', ], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.end(); }); }); }); }); batch.test('Custom Layout', (t) => { messages = []; const serverConfig = { endMsg: '__LOG4JS__', deserialise: (log) => JSON.parse(log), }; server = makeServer(serverConfig); log4js.addLayout( 'json', () => function (logEvent) { return JSON.stringify({ time: logEvent.startTime, message: logEvent.data[0], level: logEvent.level.toString(), }); } ); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { customLayout: { type: 'tcp', port, layout: { type: 'json' }, }, }, categories: { default: { appenders: ['customLayout'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent as a customized json.'); logger.info( 'This should also be sent via TCP as a customized json and not break things.' ); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { message: 'This should be sent as a customized json.', level: 'INFO', }); t.match(messages[1], { message: 'This should also be sent via TCP as a customized json and not break things.', level: 'INFO', }); t.end(); }); }); }); }); batch.test('when underlying stream errors', (t) => { const fakeNet = makeFakeNet(); const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { net: fakeNet, }, }); sandboxedLog4js.configure({ appenders: { default: { type: 'tcp' }, }, categories: { default: { appenders: ['default'], level: 'debug' }, }, }); const logger = sandboxedLog4js.getLogger(); logger.info('before connect'); t.test( 'should buffer messages written before socket is connected', (assert) => { assert.equal(fakeNet.data.length, 0); assert.equal(fakeNet.createConnectionCalled, 1); assert.end(); } ); fakeNet.cbs.connect(); t.test('should flush buffered messages', (assert) => { assert.equal(fakeNet.data.length, 1); assert.equal(fakeNet.createConnectionCalled, 1); assert.match(fakeNet.data[0], 'before connect'); assert.end(); }); logger.info('after connect'); t.test( 'should write log messages to socket as flatted strings with a terminator string', (assert) => { assert.equal(fakeNet.data.length, 2); assert.match(fakeNet.data[0], 'before connect'); assert.ok(fakeNet.data[0].endsWith('__LOG4JS__')); assert.match(fakeNet.data[1], 'after connect'); assert.ok(fakeNet.data[1].endsWith('__LOG4JS__')); assert.equal(fakeNet.encoding, 'utf8'); assert.end(); } ); fakeNet.cbs.error(); logger.info('after error, before close'); fakeNet.cbs.close(); logger.info('after close, before connect'); fakeNet.cbs.connect(); logger.info('after error, after connect'); t.test('should attempt to re-open the socket on error', (assert) => { assert.equal(fakeNet.data.length, 5); assert.equal(fakeNet.createConnectionCalled, 2); assert.match(fakeNet.data[2], 'after error, before close'); assert.match(fakeNet.data[3], 'after close, before connect'); assert.match(fakeNet.data[4], 'after error, after connect'); assert.end(); }); t.test('should buffer messages until drain', (assert) => { const previousLength = fakeNet.data.length; logger.info('should not be flushed'); assert.equal(fakeNet.data.length, previousLength); 
assert.notMatch( fakeNet.data[fakeNet.data.length - 1], 'should not be flushed' ); fakeNet.cbs.drain(); assert.equal(fakeNet.data.length, previousLength + 1); assert.match( fakeNet.data[fakeNet.data.length - 1], 'should not be flushed' ); assert.end(); }); t.test('should serialize an Error correctly', (assert) => { const previousLength = fakeNet.data.length; logger.error(new Error('Error test')); fakeNet.cbs.drain(); assert.equal(fakeNet.data.length, previousLength + 1); const raw = fakeNet.data[fakeNet.data.length - 1]; const offset = raw.indexOf('__LOG4JS__'); assert.ok( flatted.parse(raw.slice(0, offset !== -1 ? offset : 0)).data[0].stack, `Expected:\n\n${fakeNet.data[6]}\n\n to have a 'data[0].stack' property` ); const actual = flatted.parse(raw.slice(0, offset !== -1 ? offset : 0)) .data[0].stack; assert.match(actual, /^Error: Error test/); assert.end(); }); t.end(); }); batch.end(); });
const { test } = require('tap'); const net = require('net'); const flatted = require('flatted'); const sandbox = require('@log4js-node/sandboxed-module'); const log4js = require('../../lib/log4js'); const LoggingEvent = require('../../lib/LoggingEvent'); let messages = []; let server = null; function makeServer(config) { server = net.createServer((socket) => { socket.setEncoding('utf8'); socket.on('data', (data) => { data .split(config.endMsg) .filter((s) => s.length) .forEach((s) => { messages.push(config.deserialise(s)); }); }); }); server.unref(); return server; } function makeFakeNet() { return { data: [], cbs: {}, createConnectionCalled: 0, createConnection(port, host) { const fakeNet = this; this.port = port; this.host = host; this.createConnectionCalled += 1; return { on(evt, cb) { fakeNet.cbs[evt] = cb; }, write(data, encoding) { fakeNet.data.push(data); fakeNet.encoding = encoding; return false; }, end() { fakeNet.closeCalled = true; }, }; }, createServer(cb) { const fakeNet = this; cb({ remoteAddress: '1.2.3.4', remotePort: '1234', setEncoding(encoding) { fakeNet.encoding = encoding; }, on(event, cb2) { fakeNet.cbs[event] = cb2; }, }); return { listen(port, host) { fakeNet.port = port; fakeNet.host = host; }, }; }, }; } test('TCP Appender', (batch) => { batch.test('Default Configuration', (t) => { messages = []; const serverConfig = { endMsg: '__LOG4JS__', deserialise: (log) => LoggingEvent.deserialise(log), }; server = makeServer(serverConfig); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { default: { type: 'tcp', port }, }, categories: { default: { appenders: ['default'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent via TCP.'); logger.info('This should also be sent via TCP and not break things.'); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { data: ['This should be sent via TCP.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.match(messages[1], { data: ['This should also be sent via TCP and not break things.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.end(); }); }); }); }); batch.test('Custom EndMessage String', (t) => { messages = []; const serverConfig = { endMsg: '\n', deserialise: (log) => LoggingEvent.deserialise(log), }; server = makeServer(serverConfig); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { customEndMsg: { type: 'tcp', port, endMsg: '\n' }, }, categories: { default: { appenders: ['customEndMsg'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent via TCP using a custom EndMsg string.'); logger.info( 'This should also be sent via TCP using a custom EndMsg string and not break things.' 
); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { data: ['This should be sent via TCP using a custom EndMsg string.'], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.match(messages[1], { data: [ 'This should also be sent via TCP using a custom EndMsg string and not break things.', ], categoryName: 'default', context: {}, level: { levelStr: 'INFO' }, }); t.end(); }); }); }); }); batch.test('Custom Layout', (t) => { messages = []; const serverConfig = { endMsg: '__LOG4JS__', deserialise: (log) => JSON.parse(log), }; server = makeServer(serverConfig); log4js.addLayout( 'json', () => function (logEvent) { return JSON.stringify({ time: logEvent.startTime, message: logEvent.data[0], level: logEvent.level.toString(), }); } ); server.listen(() => { const { port } = server.address(); log4js.configure({ appenders: { customLayout: { type: 'tcp', port, layout: { type: 'json' }, }, }, categories: { default: { appenders: ['customLayout'], level: 'debug' }, }, }); const logger = log4js.getLogger(); logger.info('This should be sent as a customized json.'); logger.info( 'This should also be sent via TCP as a customized json and not break things.' ); log4js.shutdown(() => { server.close(() => { t.equal(messages.length, 2); t.match(messages[0], { message: 'This should be sent as a customized json.', level: 'INFO', }); t.match(messages[1], { message: 'This should also be sent via TCP as a customized json and not break things.', level: 'INFO', }); t.end(); }); }); }); }); batch.test('when underlying stream errors', (t) => { const fakeNet = makeFakeNet(); const sandboxedLog4js = sandbox.require('../../lib/log4js', { requires: { net: fakeNet, }, }); sandboxedLog4js.configure({ appenders: { default: { type: 'tcp' }, }, categories: { default: { appenders: ['default'], level: 'debug' }, }, }); const logger = sandboxedLog4js.getLogger(); logger.info('before connect'); t.test( 'should buffer messages written before socket is connected', (assert) => { assert.equal(fakeNet.data.length, 0); assert.equal(fakeNet.createConnectionCalled, 1); assert.end(); } ); fakeNet.cbs.connect(); t.test('should flush buffered messages', (assert) => { assert.equal(fakeNet.data.length, 1); assert.equal(fakeNet.createConnectionCalled, 1); assert.match(fakeNet.data[0], 'before connect'); assert.end(); }); logger.info('after connect'); t.test( 'should write log messages to socket as flatted strings with a terminator string', (assert) => { assert.equal(fakeNet.data.length, 2); assert.match(fakeNet.data[0], 'before connect'); assert.ok(fakeNet.data[0].endsWith('__LOG4JS__')); assert.match(fakeNet.data[1], 'after connect'); assert.ok(fakeNet.data[1].endsWith('__LOG4JS__')); assert.equal(fakeNet.encoding, 'utf8'); assert.end(); } ); fakeNet.cbs.error(); logger.info('after error, before close'); fakeNet.cbs.close(); logger.info('after close, before connect'); fakeNet.cbs.connect(); logger.info('after error, after connect'); t.test('should attempt to re-open the socket on error', (assert) => { assert.equal(fakeNet.data.length, 5); assert.equal(fakeNet.createConnectionCalled, 2); assert.match(fakeNet.data[2], 'after error, before close'); assert.match(fakeNet.data[3], 'after close, before connect'); assert.match(fakeNet.data[4], 'after error, after connect'); assert.end(); }); t.test('should buffer messages until drain', (assert) => { const previousLength = fakeNet.data.length; logger.info('should not be flushed'); assert.equal(fakeNet.data.length, previousLength); 
assert.notMatch( fakeNet.data[fakeNet.data.length - 1], 'should not be flushed' ); fakeNet.cbs.drain(); assert.equal(fakeNet.data.length, previousLength + 1); assert.match( fakeNet.data[fakeNet.data.length - 1], 'should not be flushed' ); assert.end(); }); t.test('should serialize an Error correctly', (assert) => { const previousLength = fakeNet.data.length; logger.error(new Error('Error test')); fakeNet.cbs.drain(); assert.equal(fakeNet.data.length, previousLength + 1); const raw = fakeNet.data[fakeNet.data.length - 1]; const offset = raw.indexOf('__LOG4JS__'); assert.ok( flatted.parse(raw.slice(0, offset !== -1 ? offset : 0)).data[0].stack, `Expected:\n\n${fakeNet.data[6]}\n\n to have a 'data[0].stack' property` ); const actual = flatted.parse(raw.slice(0, offset !== -1 ? offset : 0)) .data[0].stack; assert.match(actual, /^Error: Error test/); assert.end(); }); t.end(); }); batch.end(); });
-1
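For readability, a small sketch of the marker scheme that the PR 1,332 description above lays out in its table. This is not the log4js implementation: only the `__LOG4JS_*__` marker strings come from that description, while the function names (`wrapSpecials`, `unwrapSpecials`, `serialise`, `deserialise`) and the use of plain JSON (instead of the flatted strings the clustering, multiprocess, and tcp appenders actually transmit) are assumptions made purely for illustration.

```javascript
// Hedged, standalone sketch of the marker-based round trip described above.
// Marker strings are from the PR description; everything else is illustrative
// and is NOT the log4js API.

function wrapSpecials(value) {
  // Values JSON cannot represent are replaced with marker strings.
  if (value === undefined) return '__LOG4JS_undefined__';
  if (Number.isNaN(value)) return '__LOG4JS_NaN__';
  if (value === Infinity) return '__LOG4JS_Infinity__';
  if (value === -Infinity) return '__LOG4JS_-Infinity__';
  return value;
}

function unwrapSpecials(value) {
  // The receiver maps the marker strings back to the original values.
  if (value === '__LOG4JS_undefined__') return undefined;
  if (value === '__LOG4JS_NaN__') return NaN;
  if (value === '__LOG4JS_Infinity__') return Infinity;
  if (value === '__LOG4JS_-Infinity__') return -Infinity;
  return value;
}

function serialise(event) {
  return JSON.stringify(event, (key, value) => wrapSpecials(value));
}

function deserialise(text) {
  // A JSON.parse reviver that returns undefined would delete the property,
  // so walk the parsed value instead and restore markers in place.
  const restore = (value) => {
    if (Array.isArray(value)) return value.map(restore);
    if (value && typeof value === 'object') {
      Object.keys(value).forEach((key) => {
        value[key] = restore(value[key]);
      });
      return value;
    }
    return unwrapSpecials(value);
  };
  return restore(JSON.parse(text));
}

// Round trip: the deserialised output matches the original input.
const input = { a: Number('abc'), b: 1 / 0, c: -1 / 0, d: [undefined] };
const output = deserialise(serialise(input));
console.log(output); // { a: NaN, b: Infinity, c: -Infinity, d: [ undefined ] }
```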
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/date-file-rolling.js
'use strict'; const log4js = require('../lib/log4js'); log4js.configure({ appenders: { file: { type: 'dateFile', filename: 'thing.log', numBackups: 3, pattern: '.mm', }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); const logger = log4js.getLogger('thing'); setInterval(() => { logger.info('just doing the thing'); }, 1000);
'use strict'; const log4js = require('../lib/log4js'); log4js.configure({ appenders: { file: { type: 'dateFile', filename: 'thing.log', numBackups: 3, pattern: '.mm', }, }, categories: { default: { appenders: ['file'], level: 'debug' }, }, }); const logger = log4js.getLogger('thing'); setInterval(() => { logger.info('just doing the thing'); }, 1000);
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/rabbitmq-appender.js
// Note that rabbitmq appender needs install amqplib to work. const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', }, file: { type: 'dateFile', filename: 'logs/log.txt', pattern: 'yyyyMMdd', alwaysIncludePattern: false, }, mq: { type: '@log4js-node/rabbitmq', host: '127.0.0.1', port: 5672, username: 'guest', password: 'guest', routing_key: 'logstash', exchange: 'exchange_logs', mq_type: 'direct', durable: true, layout: { type: 'pattern', pattern: '%d{yyyy-MM-dd hh:mm:ss:SSS}#%p#%m', }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, dateFile: { appenders: ['file'], level: 'info' }, rabbitmq: { appenders: ['mq'], level: 'info' }, }, }); const log = log4js.getLogger('console'); const logRabbitmq = log4js.getLogger('rabbitmq'); function doTheLogging(x) { log.info('Logging something %d', x); logRabbitmq.info('Logging something %d', x); } for (let i = 0; i < 500; i += 1) { doTheLogging(i); }
// Note that rabbitmq appender needs install amqplib to work. const log4js = require('../lib/log4js'); log4js.configure({ appenders: { out: { type: 'console', }, file: { type: 'dateFile', filename: 'logs/log.txt', pattern: 'yyyyMMdd', alwaysIncludePattern: false, }, mq: { type: '@log4js-node/rabbitmq', host: '127.0.0.1', port: 5672, username: 'guest', password: 'guest', routing_key: 'logstash', exchange: 'exchange_logs', mq_type: 'direct', durable: true, layout: { type: 'pattern', pattern: '%d{yyyy-MM-dd hh:mm:ss:SSS}#%p#%m', }, }, }, categories: { default: { appenders: ['out'], level: 'info' }, dateFile: { appenders: ['file'], level: 'info' }, rabbitmq: { appenders: ['mq'], level: 'info' }, }, }); const log = log4js.getLogger('console'); const logRabbitmq = log4js.getLogger('rabbitmq'); function doTheLogging(x) { log.info('Logging something %d', x); logRabbitmq.info('Logging something %d', x); } for (let i = 0; i < 500; i += 1) { doTheLogging(i); }
-1
log4js-node/log4js-node
1,332
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`
Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
lamweili
"2022-10-01T10:52:35Z"
"2022-10-01T10:57:05Z"
916eef11f1d2aa2f32765f956f1f674745feb8b6
570ef530dc02d3e843a5421cb015bb8fadfe0b41
fix(LoggingEvent): serde for `NaN`, `Infinity`, `-Infinity`, `undefined`. Fixes #1187 Supersedes PR #1188 ## Affected Components Only affects clustering, multiprocessAppender, and tcpAppender. These three will `serialise()` to `String` to transmit for the receiver to `deserialise()`. | Code | Object<br>(Input) | Serialised<br>(Transmission) | Deserialised<br>(Output) | Match |-|-|-|-|-| `{"a": Number("abc")}` | `{"a": NaN}` | `{"a": "__LOG4JS_NaN__"}` | `{"a": NaN}` | ✔️ | `{"b": 1/0}` | `{"b": Infinity}` | `{"b": "__LOG4JS_Infinity__"}` | `{"b": Infinity}` | ✔️ | `{"c": -1/0}` | `{"c": -Infinity}` | `{"c": "__LOG4JS_-Infinity__"}` | `{"c": -Infinity}` | ✔️ | `[undefined]` | `[undefined]` | `["__LOG4JS_undefined__"]` | `[undefined]` | ✔️ | Compared to PR #1188, now the output matches exactly the input.
./examples/custom-layout.js
const log4js = require('../lib/log4js'); log4js.addLayout( 'json', (config) => function (logEvent) { return JSON.stringify(logEvent) + config.separator; } ); log4js.configure({ appenders: { out: { type: 'stdout', layout: { type: 'json', separator: ',' } }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, }); const logger = log4js.getLogger('json-test'); logger.info('this is just a test'); logger.error('of a custom appender'); logger.warn('that outputs json'); log4js.shutdown(() => {});
const log4js = require('../lib/log4js'); log4js.addLayout( 'json', (config) => function (logEvent) { return JSON.stringify(logEvent) + config.separator; } ); log4js.configure({ appenders: { out: { type: 'stdout', layout: { type: 'json', separator: ',' } }, }, categories: { default: { appenders: ['out'], level: 'info' }, }, }); const logger = log4js.getLogger('json-test'); logger.info('this is just a test'); logger.error('of a custom appender'); logger.warn('that outputs json'); log4js.shutdown(() => {});
-1