|
Post by terryrocks on Aug 9, 2016 10:51:29 GMT -6
I've been reamping lately and would really like to know how to match my original instrument-level signal (pre-DI) to the one I'm reamping (post reamp box). What's the best way to measure these levels?
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Aug 9, 2016 12:18:02 GMT -6
Hm, if I understand the application right, you want to meter/measure the pure guitar output so you have a value to calibrate against at reamping time, giving you something reliable for the same level/transient behaviour into amps etc.? I guess a buffered peak detector would work best for a practical comparison. I would expect peaks of maybe 1.5 V or below, and you could calibrate to usable real-world values, maybe with two ranges for passive and active guitars. This circuit might do; feed its output to a simple DC voltmeter: falstad.com/circuit/e-peak-detect.html
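(Not from the original post, but if you'd rather stay in the box than build the hardware peak detector, a short script like this could report the peak of the recorded DI track so you have a repeatable number to calibrate against; the file name and 16-bit WAV format are my assumptions.)

```python
# Rough sketch: report the digital peak of a DI track exported from the DAW.
# Assumes a 16-bit PCM WAV and a little-endian machine (the common case).
import wave
import array
import math

def peak_dbfs(path):
    with wave.open(path, "rb") as wf:
        if wf.getsampwidth() != 2:
            raise ValueError("expected 16-bit PCM")
        raw = wf.readframes(wf.getnframes())
    samples = array.array("h", raw)          # signed 16-bit samples
    peak = max(abs(s) for s in samples)      # absolute peak across all channels
    return 20 * math.log10(peak / 32768.0)   # peak relative to digital full scale

if __name__ == "__main__":
    # "di_track.wav" is just a placeholder name.
    print(f"DI track peak: {peak_dbfs('di_track.wav'):.2f} dBFS")
```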
|
|
|
Post by svart on Aug 9, 2016 12:21:35 GMT -6
I'd take a signal generator with a 1 kHz sine wave into the amp, turn it up to a level you're good with, then measure the loaded AC voltage at the amp input. Now play the same signal back from the DI recording, measure again at the amp input, and match the voltage. That's probably the most scientific way to do it.
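(Not part of svart's post, but if you don't have a hardware signal generator handy, you could bounce a 1 kHz tone from a script like this, play it from the DAW through the reamp box, and do the same voltage measurement at the amp input; the level, length, and file name here are arbitrary choices.)

```python
# Rough sketch: write a 1 kHz sine test tone as a 16-bit mono WAV for the
# level-matching measurement. Pick a peak level that suits your chain.
import wave
import array
import math

SAMPLE_RATE = 44100
FREQ_HZ = 1000.0
LEVEL_DBFS = -12.0        # peak level of the tone, relative to full scale
SECONDS = 10

amplitude = int(32767 * 10 ** (LEVEL_DBFS / 20.0))
samples = array.array("h", (
    int(amplitude * math.sin(2 * math.pi * FREQ_HZ * n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE * SECONDS)
))

with wave.open("test_tone_1k.wav", "wb") as wf:
    wf.setnchannels(1)        # mono
    wf.setsampwidth(2)        # 16-bit PCM
    wf.setframerate(SAMPLE_RATE)
    wf.writeframes(samples.tobytes())
```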
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Aug 9, 2016 13:09:01 GMT -6
D'oh. svart beats me easily; I was thinking in the wrong direction, I guess. Had a long day... Yes, you only need a one-time measurement for calibration, and he explained the easiest way to do it right.
|
|