At our organization, we run a lot of ad-hoc DML to correct bugs or to bridge gaps until functionality is added or fixed. As such, we like to keep records of these transactions for ourselves as well as for auditors. We would like to use the sqlcmd utility to capture the commands and results as the DBAs run them. The most common approach is the "-o" command line option; however, using it costs us the interactive nature of the session.
For example, "sqlcmd -E -S <server> -e -W -o sql_out.txt" sends all output to sql_out.txt, but I then lose the ability to analyze the results of my DML as they happen: the output goes to the file, and it is not immediately written out (it sits in the buffer). For instance, a developer or BSA may send us a statement that is expected to affect 100 records. If we begin a transaction, run the update, and it comes back affecting 1,000 records, we can issue ROLLBACK TRANSACTION rather than COMMIT TRANSACTION once that feedback is presented to us. That feedback is lost if I use the "-o" option. I also thought about running the Get-Content cmdlet against the generated log, but because the output is buffered I can't analyze the statements just executed; I must wait for the buffer to flush to the file. That is obviously unacceptable for our situation.
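What we have in mind is essentially a "tee": every line of output should hit the screen and the log at the same moment. A minimal sketch of that idea in Python follows; run_and_tee is our own hypothetical name, the commented-out sqlcmd invocation uses the same placeholder server as above, and the sketch assumes sqlcmd keeps writing line by line when its stdout is a pipe rather than a console, which may not hold in practice.

```python
import subprocess
import sys

def run_and_tee(cmd, log_path):
    """Run cmd, echoing each line of its output to the console
    while also appending it, line-buffered, to log_path."""
    # buffering=1 flushes the log on every newline, so the audit
    # file stays current instead of lagging behind a full buffer.
    with open(log_path, "a", buffering=1) as log, \
         subprocess.Popen(cmd, stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT, text=True) as proc:
        for line in proc.stdout:
            sys.stdout.write(line)  # interactive feedback stays visible
            log.write(line)         # ...and is captured for the auditors
    return proc.returncode

# Hypothetical invocation; <server> is a placeholder as in the example above:
# run_and_tee(["sqlcmd", "-E", "-S", "<server>", "-e", "-W"], "sql_out.txt")
```

Because stdin is left attached to the console, we could still type our statements interactively; only stdout is intercepted. Whether sqlcmd flushes promptly under these conditions is exactly what we are unsure about.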
I tried the Start-Transcript cmdlet in PowerShell as well, but once I'm inside sqlcmd its console I/O bypasses the transcript, and none of my SQL commands or results show up in the transcript file.