It works fine as a single command:

curl "someURL"
curl -o - "someURL"

but it doesn't work in a pipeline:

curl "someURL" | tr -d '\n'
curl -o - "someURL" | tr -d '\n'

it returns:

(23) Failed writing body

What is the problem with piping the curl output? How can I buffer the whole curl output and then handle it?

For me it works, no need to buffer. – hek2mgl May 23 '13 at 0:20
does this work in a pipeline too?: curl 'http://www.multitran.ru/c/m.exe?CL=1&s=hello&l1=1' | tr -d '\n' – static May 23 '13 at 0:22
Added osx tags. Unfortunately I cannot help with this. I'm using Linux – hek2mgl May 23 '13 at 0:29
the problem was the encoding of the page (Cyrillic, windows-1251). So I must use iconv -f ... – static May 23 '13 at 0:59
Just as another hint: Mine failed, because the disk was full. – Vince Varga Oct 26 '16 at 12:32

This happens when a piped program (e.g. grep) closes the read pipe before the previous program is finished writing the whole page.

In curl "url" | grep -qs foo, as soon as grep has what it wants it will close the read stream from curl. cURL doesn't expect this and emits the "Failed writing body" error.

A workaround is to pipe the stream through an intermediary program that always reads the whole page before feeding it to the next program.

E.g.

curl "url" | tac | tac | grep -qs foo

tac is a simple Unix program that reads the entire input page and reverses the line order (hence we run it twice). Because it has to read the whole input to find the last line, it will not output anything to grep until cURL is finished. Grep will still close the read stream when it has what it's looking for, but it will only affect tac, which doesn't emit an error.
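This failure mode can be reproduced without curl or a network: any writer whose reader exits early hits the same broken pipe. A minimal sketch:

```shell
# Reproduce the early-close condition without curl: `yes` streams
# "y" forever, `head -n 1` reads one line and closes the pipe, so
# the writer is killed by SIGPIPE -- the same situation curl is in
# when `grep -q` exits as soon as it finds a match.
yes | head -n 1
```

The pipeline prints a single `y`; `yes` dies from the broken pipe, which is exactly the condition curl reports as "Failed writing body".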

Could you not simply pipe it through cat once? Solves the issue for me, at least. – benvd Jun 10 '16 at 14:26
No. It might help with small documents, but when the document is too large to fit in the buffer cat uses, the error will reappear. You could use -s to silence all error messages (and progress) if you don't need them. – Kaworu Jun 13 '16 at 10:37
tac|tac changes the input if the input does not end with a linefeed; for example, printf a\\nb\\nc|tac|tac prints a\ncb, where \n is a linefeed. You can use sponge /dev/stdout instead. Another option is printf %s\\n "$(cat)", but when the input contains null bytes, shells other than Zsh either skip the null bytes or stop reading after the first null byte. – user4669748 Sep 24 '16 at 18:20
From the docs: CURLE_WRITE_ERROR (23) An error occurred when writing received data to a local file, or an error was returned to libcurl from a write callback. curl.haxx.se/libcurl/c/libcurl-errors.html – Jordan Stewart Jan 20 at 0:47

So it was a problem of encoding. iconv solves the problem:

curl 'http://www.multitran.ru/c/m.exe?CL=1&s=hello&l1=1' | iconv -f windows-1251 | tr -dc '[:print:]' | ...
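The conversion step can be checked locally without fetching the page. The bytes below are an assumed sample ("привет", i.e. "hello", in windows-1251), not the page's actual content:

```shell
# Decode windows-1251 bytes to UTF-8 with iconv, standing in for the
# curl fetch above. The octal escapes \357\360\350\342\345\362 are
# "привет" ("hello") encoded in windows-1251.
printf '\357\360\350\342\345\362' | iconv -f windows-1251 -t utf-8
```

Without the iconv step, the raw cp1251 bytes reach tr and the terminal as mojibake.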

You can do this instead of using -o option:

curl [url] > [file]
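If you still want a pipeline afterwards, the same idea works with a temporary file: let curl finish writing the complete response first, then process the file, so no downstream reader can close a pipe on curl. A sketch where the printf line stands in for the actual download:

```shell
# Buffer the whole response in a temp file, then process it.
# The printf line is a stand-in for: curl "someURL" > "$tmp"
tmp=$(mktemp)
printf 'hello\nworld\n' > "$tmp"
tr -d '\n' < "$tmp"   # reads a complete file; there is no pipe to break
rm -f "$tmp"
```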

so, not using the pipe and instead doing all the work via the file system? I wanted to use curl's output with pipes. – static Aug 20 '13 at 14:27
The question says only piped sequences don't work. – Det Sep 15 '14 at 12:33

(For completeness and future searches) It's a matter of how curl manages its output buffer; the -N option disables buffering.

E.g.: curl -s -N "URL" | grep -q Welcome

that doesn't work. – Hamish Moffatt Aug 15 '16 at 6:37
It does work ! Thanks! – Astinus Eberhard Dec 14 '16 at 12:56
