
GAN generator loss is 0 from the start

I need data augmentation for network traffic and I'm following an article that specifies the structure of both the discriminator and the generator. My input data is a collection of pcap files, each containing 10 packets of 250 bytes. Each file is transformed into a (10, 250) array and every byte is cast to float64.

import numpy as np
from scapy.all import raw # assuming scapy's raw(), which serializes a packet to bytes

def Byte_p_to_float(byte_p):
    # iterating over a bytes object yields ints in 0-255, which cast to float directly
    float_b = []
    for byte in byte_p:
        float_b.append(float(byte))
    return float_b

Xtrain = []
a = []
for i in range(len(Dataset)):
    for p in range(10):
        a.append(Byte_p_to_float(raw(Dataset[i][p]))) # packet -> bytes -> list of floats
    Xtrain.append(a)
    a = []
Xtrain = np.asarray(Xtrain)
Xtrain = Xtrain / 127.5 - 1.0 # normalize byte values (0-255) to [-1, 1]
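
For reference, here is a shorter, vectorized way to build the same array (just a sketch, assuming raw() returns plain bytes and every file really contains 10 packets of 250 bytes):

Xtrain = np.stack([
    np.stack([np.frombuffer(raw(Dataset[i][p]), dtype=np.uint8) for p in range(10)])
    for i in range(len(Dataset))
]).astype(np.float64)
Xtrain = Xtrain / 127.5 - 1.0 # same [-1, 1] scaling as above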

I then train the model, but the generator loss is 0 from the very first iteration:

batch_size = 128
interval = 5
iterations = 2000

real = [] # a list of all 1s, the labels for real data
for i in range(batch_size):
    real.append(1)

fake = [] # a list of all 0s, the labels for fake data
for i in range(batch_size):
    fake.append(0)


for iteration in range(iterations):

    ids = np.random.randint(0,Xtrain.shape[0],batch_size)
    flows = Xtrain[ids]
    
    z = np.random.normal(0, 1, (batch_size, 100)) # generating gaussian noise vector!
    gen_flows = generator_v.predict(z) 
    gen_flows = ((gen_flows - np.amin(gen_flows))/(np.amax(gen_flows) - np.amin(gen_flows))) * 2 - 1 # normalizing. (-1,+1)
    
    # the generator returns float32; cast to float64 to match Xtrain (not sure this is necessary)
    gen_flows = gen_flows.astype(np.float64).reshape(batch_size, 2500)
    
    nreal = np.asarray(real)
    nfake = np.asarray(fake)
    nflows = flows.reshape(batch_size, 2500) # flatten each (10, 250) flow to 2500 features, as in the article
    dloss_real = discriminator_v.train_on_batch(nflows, nreal) # training the discriminator on real data
        
    dloss_fake = discriminator_v.train_on_batch(gen_flows, nfake) # training the discriminator on fake data

    dloss, accuracy = 0.5 * np.add(dloss_real,dloss_fake)

    z = np.random.normal(0, 1, (batch_size, 100)) # generating gaussian noise vector for GAN
    gloss = gan_v.train_on_batch(z, nreal)

    if (iteration + 1) % interval == 0:
        losses.append((dloss, gloss))
        accuracies.append(100.0 * accuracy)
        iteration_checks.append(iteration + 1)
        print("%d [D loss: %f , acc: %.2f] [G loss: %f]" % (iteration+1,dloss,100.0*accuracy,gloss))
        

[The model description from the article is here.][1]

And finally, here is my model:

from tensorflow.keras.models import Sequential # imports assumed to be tf.keras; adjust if using standalone Keras
from tensorflow.keras.layers import Input, Dense, LeakyReLU, BatchNormalization, Dropout
from tensorflow.keras.optimizers import Adam

losses = []
accuracies = []
iteration_checks = []
zdim = np.random.normal(0, 1, 100) # 100-dimensional Gaussian noise vector (only its length is used below)

def build_generator(gause_len):
    model = Sequential()

    model.add(Input(shape=(gause_len,)))
    model.add(Dense(256))
    model.add(LeakyReLU())
    model.add(BatchNormalization())

    model.add(Dense(512))
    model.add(LeakyReLU())
    model.add(BatchNormalization())

    model.add(Dense(1024))
    model.add(LeakyReLU())
    model.add(BatchNormalization())

    model.add(Dense(2500))
    model.add(LeakyReLU(2500))
    #model.add(reshape(img_shape))
    return model

def build_discriminator():
    model = Sequential()

    model.add(Input(shape=(2500,))) # input shape: a flattened (10, 250) flow
    model.add(Dense(2500))
    #model.add( Dense(2500, input_shape=img_shape) )
    model.add(LeakyReLU())
    model.add(Dropout(0.5))

    model.add(Dense(1024))
    model.add(LeakyReLU())
    model.add(Dropout(0.5))

    model.add(Dense(512))
    model.add(LeakyReLU())
    model.add(Dropout(0.5))

    model.add(Dense(256))
    model.add(LeakyReLU())
    model.add(Dropout(0.5))

    model.add(Dense(1)) # linear output, no sigmoid
    return model

def build_gan(generator, discriminator):
    model = Sequential()
    model.add(generator)
    model.add(discriminator)
    return model

# used for training the discriminator network
discriminator_v = build_discriminator()
discriminator_v.compile(loss='binary_crossentropy', optimizer=Adam(), metrics=['accuracy'])

# used for training the generator network
generator_v = build_generator(len(zdim))
discriminator_v.trainable = False # freeze the discriminator inside the combined model

# used for training the GAN network
gan_v = build_gan(generator_v, discriminator_v)
gan_v.compile(loss='binary_crossentropy', optimizer=Adam())
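
As a quick smoke test of the stacked model (a sketch, using only the objects defined above):

z = np.random.normal(0, 1, (4, 100))
print(generator_v.predict(z).shape) # expect (4, 2500)
print(gan_v.predict(z).shape) # expect (4, 1); a raw score, since the discriminator has no final sigmoid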

AI isn't my area of expertise and all of this is part of a much larger project, so the error may well be something obvious. Any help would be much appreciated.

  [1]: https://i.stack.imgur.com/DqcjR.png



source https://stackoverflow.com/questions/72028701/gan-generator-loss-is-0-from-the-start
