Remove long string from files

find, grep, replace, sed, text processing

Files on my server have been corrupted. I want to remove a 13,000-character string from all PHP files that contain it.

The string looks like:

<?php if(!isset($GLOBALS["\x61\156\x75\156\x61"])) { $ua=strtolower($_SERVER[ ... $qhroczocgv=$qjhvvbyvyv; $qhroczocgv=(729-608); $boxknervrr=$qhroczocgv-1; ?>

With ellipses inserted for brevity.

When I search for the string using grep, I get

grep: Invalid back reference

despite escaping \![]$.

How do I first find all the files containing the entire string, and then how do I remove it from every file?
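The "Invalid back reference" error means grep is interpreting the string as a regular expression, so escaping individual characters is fragile; matching it as a fixed string sidesteps the problem entirely. A minimal sketch, assuming the full payload has been saved verbatim to a file (payload.txt is an illustrative name):

# -F: match the payload literally (no regex), -f: read it from a file,
# -r -l: search recursively and print only the names of matching files
grep -rlF -f payload.txt --include='*.php' .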

Best Answer

Assuming your own code follows decent conventions and has no legitimate lines anywhere near that long, just delete any line longer than a certain size:

shopt -s globstar nullglob
sed -i.bak -r '/.{10000}/d' **/*.php
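The -i.bak flag makes sed keep a copy of each original file, so the result can be reviewed before the backups are discarded. One way to do that, sketched here with globstar and nullglob still enabled from above:

for bak in **/*.php.bak; do
    diff -u "$bak" "${bak%.bak}"    # deleted lines show up prefixed with -
done
rm -f -- **/*.php.bak               # drop the backups once the diffs look right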

For @wildcard, a dry run that only reports how many lines would be removed from each file, without changing anything:

find . -name '*.php' -print0 | while IFS= read -rd "" file; do
    before=$(wc -l < "$file")
    after=$(sed -r '/.{10000}/d' "$file" | wc -l)
    case $(( diff = before - after )) in
        0) :;;  # no-op
        *) echo "will remove $diff lines from $file";;
    esac
done
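Once the dry run reports only the files you expect, the same filter can be applied in place over the same find result set, again keeping .bak copies of the originals:

find . -name '*.php' -exec sed -i.bak -r '/.{10000}/d' {} +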