sound volume

Gurtag

Member
Sorry, I edited the thread; I was blaming HitFX wrongly. For some reason most sounds I use have to be amplified to oblivion to be audible in game. Any ideas what I could be doing wrong?
 
Last edited:
Sorry, I edited the thread; I was blaming HitFX wrongly. For some reason most sounds I use have to be amplified to oblivion to be audible in game. Any ideas what I could be doing wrong?
If the sample volume is too low, you may need to increase it a bit using sound editors.
 
I also balance my sounds with Audacity, making sure they are all set to the same level: CD quality, 44100 Hz, mono OGG.
Perhaps your sounds are low quality and need to be adjusted to match the other ambient sounds.
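The levelling step Kratus describes in Audacity can also be scripted. Below is a minimal Python sketch (standard library only, so it reads/writes 16-bit PCM WAV rather than OGG, which needs an external encoder) that downmixes to mono and normalizes the peak; the file paths and the 0.9 target peak are illustrative, not anything from the engine:

```python
import wave
import array

def normalize_to_mono(src_path, dst_path, target_peak=0.9):
    """Downmix a 16-bit PCM WAV to mono and scale its peak to target_peak (0..1)."""
    with wave.open(src_path, "rb") as w:
        assert w.getsampwidth() == 2, "sketch handles 16-bit PCM only"
        channels = w.getnchannels()
        rate = w.getframerate()
        samples = array.array("h", w.readframes(w.getnframes()))

    # Average interleaved channels into a single mono track
    if channels > 1:
        mono = array.array("h", (
            sum(samples[i:i + channels]) // channels
            for i in range(0, len(samples), channels)))
    else:
        mono = samples

    # Scale so the loudest sample lands at target_peak of full scale
    peak = max(abs(s) for s in mono) or 1
    gain = target_peak * 32767 / peak
    out = array.array(
        "h", (max(-32768, min(32767, round(s * gain))) for s in mono))

    with wave.open(dst_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(out.tobytes())
```

Run it over each effect file before converting to OGG and they will all share the same peak level, which is essentially what batch-normalizing in Audacity achieves.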
 
I also balance my sounds with Audacity, making sure they are all set to the same level: CD quality, 44100 Hz, mono OGG.
Perhaps your sounds are low quality and need to be adjusted to match the other ambient sounds.
Yeah, it may be a result of the other sounds being louder. Anyway, I got them to work, though with awful quality. If I ever start another project I'll adjust all sounds to the lowest common denominator, lol.
 
I also balance my sounds with Audacity, making sure they are all set to the same level: CD quality, 44100 Hz, mono OGG.
You mean your songs, right? We can't use .ogg for sound effects.
I've been using this method for years and it gives me better results than anything else. For instance, I set the quality to 1 and it really shrinks the file size.

About mono/stereo: you won't gain anything by using stereo sounds. Devs can correct me, but as far as I remember, the sounds are played in mono.
For music, honestly, I never use stereo files unless there is some cool panning (channel switching) in the song. If not, I just set it to mono and save some space.
 
About mono/stereo: you won't gain anything by using stereo sounds. Devs can correct me, but as far as I remember, the sounds are played in mono.

Sort of. Sound files are played in mono, but you have control of left/right channel volume. It's done that way so you can mix on the fly with multiple sound files, which is actually far more powerful than just having stereo playback. You can use stereo files, but it's just wasting space and time.

Music does support stereo, since there's only one music channel and it's streamed. I very much do notice the difference in mono vs. stereo music (assuming of course the track itself was recorded in stereo), so I usually keep it, but that's all a matter of taste.

DC
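DC's description of mono files with per-channel volume control is, in effect, a pan law applied at mix time. As an illustration only (the engine's actual API isn't shown in this thread), here is a minimal sketch of the common constant-power pan law, which maps a single pan position to left/right gains:

```python
import math

def pan_gains(pan):
    """Constant-power pan law for a mono source.

    pan: -1.0 = hard left, 0.0 = center, 1.0 = hard right.
    Returns (left_gain, right_gain), each in [0, 1], with
    left**2 + right**2 == 1 so perceived loudness stays constant.
    """
    angle = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] -> [0, pi/2]
    return math.cos(angle), math.sin(angle)
```

At center, both channels get about 0.707 rather than 1.0, which is why a mono file panned this way doesn't jump in loudness as it sweeps across the field.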
 
Hey Kratus, I do that, I use Audacity. But in order to hear them I have to amplify them to the point of clipping. I was wondering if I did something wrong on my end or if it's a known thing.
Do you have a specific workflow that keeps the quality?
Yeah, it may be a result of the other sounds being louder
I always avoid reaching the clipping point in samples/music, preventing quality loss and distortion.
In this case I would prefer to lower all the other audio files so they all match a similar volume while keeping a safe distance from the clipping point.

Another tip: always work on a backup of the original audio file when changing volume, instead of repeatedly editing the same file. If you increase the volume past the clipping point, lowering it again will not fix/restore the original quality.

In this video you can see what I'm talking about.
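Why boosting past clipping is irreversible can be shown with a tiny sketch (illustrative only, using plain 16-bit sample values like those in a CD-quality WAV): once a peak is flattened at the integer ceiling, halving the gain afterwards does not bring the original waveform back.

```python
def apply_gain(samples, gain):
    """Scale 16-bit samples, hard-clipping at the int16 range."""
    return [max(-32768, min(32767, round(s * gain))) for s in samples]

original = [10000, 20000, 30000]
boosted = apply_gain(original, 2.0)   # 20000 and 30000 both clip at 32767
restored = apply_gain(boosted, 0.5)   # restored != original: clipped peaks stay flattened
```

This is exactly why editing a backup matters: the only way to get the unclipped peaks back is to start again from the untouched file.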
 
I very much do notice the difference in mono vs. stereo music (assuming of course the track itself was recorded in stereo), so I usually keep it, but that's all a matter of taste.
Yep, I know. There are some other effects that can be evident, like Stereo Chorus, Delay, Reverb or Phaser, all of them in stereo.
If they are mono effects, converting the music to mono would be barely noticeable.
 