- Version: 13.3.0
- Platform: Windows 10 x64
const fs = require('fs');
fs.readFile('nonexistentfile', (err) => console.log(err.stack));
What is the expected output?
err.stack should contain both the error message and a stack trace, the same as the readFileSync function:
Error: ENOENT: no such file or directory, open 'nonexistentfile'
at Object.openSync (fs.js:446:3)
at Object.readFileSync (fs.js:348:35)
…
What do you see instead?
err.stack only contains the error message:
Error: ENOENT: no such file or directory, open 'C:\dev\projects\ntest\nonexistentfile'
fs.writeFile has the same problem.
Another strange thing is that the error message is also different (relative vs. absolute path).
This is sadly a known issue. Tracking the actual call site would have a significant performance impact on all fs functions, so it has not been implemented so far. We have tried to find solutions but have not succeeded yet. I thought there would already be an issue for this, but I could not find one.
BridgeAR added the confirmed-bug and fs labels on Dec 13, 2019
There’s definitely a number of userland solutions for this issue, though, e.g. https://www.npmjs.com/package/trace.
I wouldn’t mind bringing such an opt-in solution into core.
@addaleax good point, we might just add a flag for that.
can someone provide a summary of what the technical issue here is?
@devsnek tracking the original call site for the async operation. The only way to really do that is (AFAIK) to create a stack trace for all calls. So it’ll reduce the average performance for successful fs calls significantly.
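For illustration only, here is a sketch of what such an opt-in userland wrapper could look like (readFileTraced is a hypothetical helper, not a Node.js API): it captures a stack at call time and grafts it onto the async error, which is exactly the extra per-call work being discussed.

const fs = require('fs');

// Hypothetical opt-in wrapper: capture the call-site stack up front
// (the per-call cost discussed above) and append it to any error
// produced by the async operation.
function readFileTraced(path, options, callback) {
  if (typeof options === 'function') {
    callback = options;
    options = undefined;
  }
  const callSite = new Error('fs.readFile call site'); // stack captured here
  fs.readFile(path, options, (err, data) => {
    if (err) {
      // drop the "Error: ..." header line and keep only the frames
      err.stack += '\n' + callSite.stack.split('\n').slice(1).join('\n');
    }
    callback(err, data);
  });
}

// usage
readFileTraced('nonexistentfile', (err) => {
  if (err) console.log(err.stack);
});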
am i right in saying that the problem is that the error is created in c++ land while the js stack is empty?
I believe we do not even attempt to create a proper error in that case. It would not point to the actual call site.
@devsnek Yes, that is correct. The exception is created in FSReqAfterScope::Reject.
I’m actually not sure anymore if long stack trace modules solve this. Fixing this might also require us to add better async support for these kinds of operations; the error is currently created outside of the async scope, so no async tracking is available.
One thing that might be helpful for making async stack traces work better here is making FSReqAfterScope create an InternalCallbackScope (and removing the one in FSReqPromise<AliasedBufferT>::Resolve, plus replacing the MakeCallbacks with a plain Call()). That way, the error could be created inside the callback scope.
@heroboy If you’re talking about the await optimization proposal, that’s been implemented in Node.js v12.x and up. You can check if your node binary supports it like this:
$ node --v8-options | grep async.stack.traces
--async-stack-traces (include async stack traces in Error.stack)
Note that the optimization won't work if your program monkey-patches Promise.
@bnoordhuis, no. I know current Node.js supports this. But only async/await functions get good async stack traces; many Promise-based functions still lose the stack. So if this issue is fixed, will the fs.promises functions have full stack traces? And will there be an effort to make Node.js' promisified functions support this as well?
There's no need to do anything on Node's side, if I understand your question correctly. It's all done inside V8. The one caveat is that it only works with await and native promises.
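For example (a minimal sketch, assuming a Node.js build where the flag exists; it is enabled by default in recent releases), an await chain picks up async frames while an equivalent .then() chain does not:

// demo.js, run with: node --async-stack-traces demo.js
async function inner() {
  throw new Error('boom');
}

async function outer() {
  await inner();
}

outer().catch((err) => console.log(err.stack));
// The printed stack includes an "at async outer (demo.js:...)" frame;
// the same code written as a .then() chain would not get that frame.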
aduh95 changed the title from "fs callback functions return Errors without stack" to "fs async functions (both callback and promises APIs) return Errors without stack trace" on Jul 2, 2021
I’m in a situation where I really need stack traces to debug an fs-intensive application (a build system), yet the available information about what is necessary to achieve fs tracing is vague. From this thread I understand the following:
- Current node versions have the ability to propagate stack traces through async calls.
- That doesn’t help because there’s nothing to propagate in the first place.
The only ways to rectify the situation would then be to either change to sync fs calls or to wrap every call in a try-catch that throws its own error. Have I summed up the situation correctly?
I tested the following code on node v16.13.1 on MacOS:
// async.js
const fs = require("fs/promises");

process.on("uncaughtException", (e) => {
  console.error(e.stack || e);
});

async function main() {
  await fs.readFile("doesntexist");
}

main();
I have not yet discovered any usable environmental workaround; node -r trace ./async.js makes no difference.
Is there a solution that doesn’t change the code?
I’ve gotten around this by doing:
const fs = require(`fs`);
try {
  await fs.promises.readFile(`filePath`).catch((err) => {
    throw new Error(err);
  });
} catch (error) {
  console.error(error);
}
Something like this may work too:
fs.readFile(`filePath`, (err) => {
  if (err) throw new Error(err);
});
Hmm, this is a tricky one. Even detecting whether the file exists as a pre-flight doesn’t guarantee that you can successfully open it (it could be locked or have permission issues) and detecting if it exists before opening is a classic race condition in server development (small window, but still a race condition).
I'm still thinking there must be a better way to get an error out of fs.createReadStream(), but the only way I could find was to wrap it in a promise that only resolves when the file is successfully open. That lets you get the error from opening the file and propagate it back to the caller of your async function. Here's what that would look like:
const fs = require('fs');
const readline = require('readline');
function createReadStreamSafe(filename, options) {
return new Promise((resolve, reject) => {
const fileStream = fs.createReadStream(filename, options);
fileStream.on('error', reject).on('open', () => {
resolve(fileStream);
});
});
}
async function processLineByLine(f) {
const fileStream = await createReadStreamSafe(f);
const rl = readline.createInterface({
input: fileStream,
crlfDelay: Infinity
});
for await (const line of rl) {
// Each line in input.txt will be successively available here as `line`.
console.log(`Line from file: ${line}`);
}
}
processLineByLine("nofile").catch(err => {
console.log("caught error");
});
This makes it so that the promise that processLineByLine() returns will reject, and you can handle the error there, which is what I think you were asking for. If I misunderstood what you were asking for, then please clarify.
FYI, this seems to me to be a bug in readline.createInterface(), because it seems like it should reject on the first iteration of for await (const line of rl), but that doesn't appear to be what happens. So, as a consequence, even this work-around won't detect read errors on the stream after it's opened. That really needs to be fixed inside createInterface(). I agree that both a file open error and a read error should show up as a rejection on for await (const line of rl).
Another work-around for the file open issue would be to pre-open the file using await fs.promises.open(...) and pass the fd to fs.createReadStream; then you would see the error from the open yourself.
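A minimal sketch of that pre-open workaround (createReadStreamPreOpened is a hypothetical helper name), assuming options is an object:

const fs = require('fs');

// Open the file first with fs.promises.open(), which rejects with an error
// you can catch at the call site, then hand the handle to createReadStream.
async function createReadStreamPreOpened(filename, options) {
  // rejects here with a normal, catchable error (ENOENT, EACCES, ...)
  const handle = await fs.promises.open(filename, 'r');
  // recent Node versions accept a FileHandle for the fd option; the stream
  // then reads from (and eventually closes) the already-open handle
  return fs.createReadStream(filename, { fd: handle, ...options });
}

// usage (inside an async function):
// const stream = await createReadStreamPreOpened('input.txt');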
A Different Solution: Wrapping the readline iterator to add error handling
Warning: this ends up looking like a bit of a hack, but it was a really interesting learning project, because I ended up having to wrap the readline asyncIterator with my own in order to reject when I detected an error on the readStream (the error handling that the readline library is missing).
I set out on a mission to figure out how to write a processLineByLine() function that would return an asyncIterator that properly rejects on stream errors (even though the readline code has bugs in this regard), while still using the readline library internally.
The goal was to be able to write code like this:
for await (let line of processLineByLine("somefile1.txt")) {
console.log(line);
}
that properly handles errors on the readStream used internally, whether the file doesn't exist, exists but can't be opened, or even encounters a read error later while reading. Since I'm not changing/fixing the readline interface code internally, I had to install my own error listener on the readStream, and when I see an error there, I need to cause any pending or future promises from the readline interface to reject.
Here’s what I ended up with:
// This is an experiment to wrap the lines asyncIterator with our own iterator
// so we can reject when there's been an error on the readStream. It's really
// ugly, but does work.
const fs = require('fs');
const readline = require('readline');
function processLineByLine(filename, options = {}) {
const fileStream = fs.createReadStream(filename, options);
let latchedError = null;
let kill = new Set();
fileStream.on('error', (err) => {
latchedError = err;
// any open promises waiting on this stream, need to get rejected now
for (let fn of kill) {
fn(err);
}
});
const lines = readline.createInterface({
input: fileStream,
crlfDelay: Infinity
});
// create our own little asyncIterator that wraps the lines asyncIterator
// so we can reject when we need to
function asyncIterator() {
const linesIterator = lines[Symbol.asyncIterator]();
return {
next: function() {
if (latchedError) {
return Promise.reject(latchedError);
} else {
return new Promise((resolve, reject) => {
// save reject handlers in higher scope so they can be called
// from the stream error handler
kill.add(reject);
let p = linesIterator.next();
// have our higher level promise track the iterator promise
// except when we reject it from the outside upon stream error
p.then((data => {
// since we're resolving now, let's remove our reject
// handler from the kill storage. This will allow this scope
// to be properly garbage collected
kill.delete(reject);
resolve(data);
}), reject);
});
}
}
}
}
var asyncIterable = {
[Symbol.asyncIterator]: asyncIterator
};
return asyncIterable;
}
async function runIt() {
for await (let line of processLineByLine("xfile1.txt")) {
console.log(line);
}
}
runIt().then(() => {
console.log("done");
}).catch(err => {
console.log("final Error", err);
});
Some explanation on how this works…
Our own error monitoring on the stream
First, you can see this:
fileStream.on('error', (err) => {
latchedError = err;
// any open promises waiting on this stream, need to get rejected now
for (let fn of kill) {
fn(err);
}
});
This is our own error monitoring on the readStream to make up for the missing error handling inside of readline. Any time we see an error, we save it in a higher-scoped variable for potential later use and, if there are any pending promises registered from readline for this stream, we "kill" them (which rejects them; you will see later how that works).
No special handling for file open errors
Part of the goal here was to get rid of the special handling in the previous solution for file open errors. We want ANY error on the readStream to trigger a rejection of the asyncIterable, so this is a much more general-purpose mechanism. The file open error gets caught by this error handling the same way any other read error would be.
Our own asyncIterable and asyncIterator
Calling readline.createInterface() returns an asyncIterable. It's basically the same as a regular iterable in that you call a special property on it to get an asyncIterator. That asyncIterator has a .next() method on it just like a regular iterator, except that when asyncIterator.next() is called, it returns a promise that resolves to the result object rather than returning the object directly.
So, that's how for await (let line of lines) works. It first calls lines[Symbol.asyncIterator]() to get an asyncIterator. Then, on the asyncIterator it gets back, it repeatedly does await asyncIterator.next(), waiting on the promise that asyncIterator.next() returns.
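Roughly speaking (a simplified sketch, ignoring break/return handling, inside an async function), the loop desugars to something like this:

// what `for await (const line of lines)` is doing under the hood
const iterator = lines[Symbol.asyncIterator]();
let result = await iterator.next();   // a promise resolving to { value, done }
while (!result.done) {
  const line = result.value;
  console.log(line);
  result = await iterator.next();
}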
Now, readline.createInterface() already returns such an asyncIterable. But it doesn't work quite right. When the readStream gets an error, it doesn't reject the promise returned by .next() on each iteration. In fact, that promise never gets rejected or resolved, so things stall. In my test app, the app would just exit because the readStream was done (after the error) and there was no longer anything keeping the app from exiting, even though a promise was still pending.
So, I needed a way to force the promise that readlineIterator.next() had previously returned, and that was currently being awaited by for await (...), to be rejected. Well, a promise doesn't provide an outward interface for rejecting it, and we don't have access to the internals of the readline implementation where it could be rejected.
My solution was to wrap the readlineIterator with my own as a sort of proxy. Then, when my own error detector sees an error and there are promise(s) outstanding from readline, I can use my proxy/wrapper to force a rejection on those outstanding promise(s). This causes the for await (...) to see the rejection and get a proper error. And it works.
It took me a while to learn enough about how asyncIterators work to be able to wrap one. I owe a lot of thanks to the Asynchronous Iterators in JavaScript article, which provided some very helpful code examples for constructing your own asyncIterable and asyncIterator. This is actually where the real learning came in this exercise, and it is where others might learn by understanding how this works in the above code.
Forcing a wrapped promise to reject
The "ugliness" in this code comes from forcing a promise to reject from outside the usual scope of that promise's reject handler. This is done by storing the reject handler in a higher-level scope where the error handler for the readStream can trigger that promise to reject. There may be a more elegant way to code this, but this works.
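For what it's worth, one arguably cleaner variant (just a sketch, using the same fs/readline setup as above) is to race each next() promise against a single promise that rejects when the stream errors, instead of tracking reject callbacks in a Set:

const fs = require('fs');
const readline = require('readline');

function processLineByLine(filename, options = {}) {
  const fileStream = fs.createReadStream(filename, options);
  // a single promise that rejects the first time the stream emits 'error'
  const failed = new Promise((resolve, reject) => fileStream.on('error', reject));
  failed.catch(() => {}); // avoid an unhandled-rejection warning while nobody is racing
  const lines = readline.createInterface({ input: fileStream, crlfDelay: Infinity });
  const linesIterator = lines[Symbol.asyncIterator]();
  return {
    [Symbol.asyncIterator]() {
      return {
        // every next() settles either with readline's result or with the stream error
        next: () => Promise.race([linesIterator.next(), failed])
      };
    }
  };
}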
Making our own asyncIterable
An async iterable is just an object that has one property on it named [Symbol.asyncIterator]. That property must be a function that, when called with no arguments, returns an asyncIterator. So, here's our asyncIterable:
var asyncIterable = {
[Symbol.asyncIterator]: asyncIterator
};
Making our own asyncIterator
An asyncIterator is a function that, when called, returns an object with a next() method on it. Each time obj.next() is called, it returns a promise that resolves to the usual iterator result object {done, value}. We don't have to worry about the resolved value because we'll just get that from readline's iterator. So, here's our asyncIterator:
// create our own little asyncIterator that wraps the lines asyncIterator
// so we can reject when we need to
function asyncIterator() {
const linesIterator = lines[Symbol.asyncIterator]();
return {
next: function() {
if (latchedError) {
return Promise.reject(latchedError);
} else {
return new Promise((resolve, reject) => {
// save reject handlers in higher scope so they can be called
// from the stream error handler
kill.add(reject);
let p = linesIterator.next();
// have our higher level promise track the iterator promise
// except when we reject it from the outside upon stream error
p.then(resolve, reject);
});
}
}
}
}
First, it gets the asyncIterator from the readline interface (the one we’re proxying/wrapping) and stores it locally in scope so we can use it later.
Then, it returns the mandatory iterator structure of the form {next: fn}. Inside that function is where our wrapping logic unfolds. If we've seen a previously latched error, then we just always return Promise.reject(latchedError). If there's no error, then we return a manually constructed promise.
Inside the executor function for that promise, we register our reject handler by adding it to the higher-scoped Set named kill. This allows our higher-scoped fileStream.on('error', ...) handler to reject this promise, by calling that function, if it sees an error.
Then, we call linesIterator.next() to get the promise that it returns. We register an interest in both the resolve and reject callbacks for that promise. If that promise is properly resolved, we remove our reject handler from the higher-level scope (to enable better garbage collection of our scope) and then resolve our wrap/proxy promise with the same resolved value.
If that linesIterator promise rejects, we just pass the reject right through our wrap/proxy promise.
Our own filestream error handling
So, now the final piece of explanation. We have this error handler watching the stream:
fileStream.on('error', (err) => {
latchedError = err;
// any open promises waiting on this stream, need to get rejected now
for (let fn of kill) {
fn(err);
}
});
This does two things. First, it stores/latches the error so any future calls to the lines iterator will just reject with this previous error. Second, if there are any pending promises from the lines iterator waiting to be resolved, it cycles through the kill Set and rejects those promises. This is what gets the asyncIterator promise properly rejected. This should be happening inside the readline code, but since it isn't doing it properly, we force our wrap/proxy promise to reject so the caller sees the proper rejection when the stream gets an error.
In the end, you can just do this, as all the ugly detail is hidden behind the wrapped asyncIterable:
async function runIt() {
for await (let line of processLineByLine("xfile1.txt")) {
console.log(line);
}
}
runIt().then(() => {
console.log("done");
}).catch(err => {
console.log("final Error", err);
});
When connecting to the Alt Linux School server via the web interface (https://myserver:8080),
after entering the root password I get the following message:
Async request failed
Request status: error
Response:
OK
and that's it… Although just a week ago I could connect over the web without any problem.
The console lets me in fine, but the web interface won't… Can someone point a "lamer" in the right direction: where should I dig?
Run
service alteratord restart
in the server console, or reboot.
service ahttpd restart
Андрей Черепанов (cas@)
I tried rebooting; it doesn't help…
Андрей Черепанов (cas@)
No, because after the updates were applied in automatic mode almost the entire web interface switched to English, and I have people who aren't comfortable with it… so I had to wipe the whole server, reinstall it from scratch, and skip the updates…
You probably needed to install or adjust something to get it back in Russian.
So the server was reinstalled, but the clients are still configured for the old server?
Actually, the clients never really got properly set up while the experiments were (and still are) going on, including the ones with updates!
But something went wrong with the updates.
That's why I decided to reinstall everything from scratch, skip updating, and start with a clean slate…
But now there's a new glitch: the web interface won't let me in…
You probably needed to install or adjust something to get it back in Russian.
Isn't this supposed to be a ready-made solution whose updates don't require any extra configuration? For some reason I thought updates didn't need additional setup.
As a rule, you could say they don't, but there are exceptions for various reasons.
Hmm… if only I knew what those reasons were.
Anyway, the problem is solved… I can now log in through the web interface…
I noticed that the /var/srv disk was completely filled with backup copies,
and the last backup hadn't even finished, which was clearly stated in the logs.
I deleted all the backups from the console, restarted the server, and logged in through the web interface.
Thanks everyone for your help!