Be careful: what if you delete part of a file, and the process is interrupted in the middle?
Disk space is cheap, so there is hardly ever any point in trying to save disk space in this way.
If you really have to do this, my recommendation is to process the file in chunks: take a chunk, encrypt it, then delete that part of the original file, and move on to the next chunk. That way, if the process is interrupted, you can remove the unfinished encrypted chunk and resume where you left off. It's easiest to process the file from the end to the beginning, because a file can be shortened from the end with a cheap truncate call, whereas removing data from the beginning would mean rewriting everything that follows.
Warning: untested code. Assumes GNU or BusyBox utilities (e.g. on Linux) and a shell with 64-bit arithmetic (to handle file sizes above 2GB).
#!/bin/sh
set -e
input_size=$(stat -c %s myfile)
fragment_size=$((1024*1024*1024))
while [ "$input_size" -gt 0 ]; do
  # The last fragment (processed first, since we work backwards) may be
  # shorter than fragment_size; every other fragment is a full fragment_size.
  chunk_size=$((input_size % fragment_size))
  [ "$chunk_size" -ne 0 ] || chunk_size=$fragment_size
  input_size=$((input_size - chunk_size))
  fragment_name=$(printf fragment-%08d "$((input_size / fragment_size))")
  # Encrypt the last chunk_size bytes under a temporary name, then rename,
  # so an interrupted run never leaves a complete-looking partial fragment.
  tail -c "$chunk_size" myfile | gpg -e >"$fragment_name.gpg.tmp"
  mv "$fragment_name.gpg.tmp" "$fragment_name.gpg"
  # Only after the fragment is safely in place, cut the chunk off the original.
  truncate -s "$input_size" myfile
done
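
If the run is interrupted, delete any leftover fragment-*.gpg.tmp file and simply re-run the script: it reads the current (already truncated) size of myfile and carries on from there.

To restore the file, decrypt the fragments in ascending order and concatenate the output. A minimal sketch under the same assumptions; the zero-padded names mean the shell's sorted glob expansion yields the right order, and myfile.restored is just an illustrative output name:

#!/bin/sh
set -e
# Decrypt each fragment to stdout, in name order, into one output file.
for fragment in fragment-*.gpg; do
  gpg -d "$fragment"
done >myfile.restored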