
Best way to minimise/streamline/use synergies of code to populate several webpages with different content but same layout? [closed]

Sorry for the non-specific title, but my question relates to coding strategy and best practice.

I'm new to coding and I'm working on a website on a local server. I'm learning a lot of new languages and technologies (HTML, CSS, PHP, jQuery, JavaScript, Ajax and MySQL) as I need them, but I'm conscious that I'm a complete amateur when it comes to the best way to apply them.

Background

  1. My website has header and footer pages that include all the menu navigation, login system, css/js link type of things.
  2. Every other webpage pulls in the header and footer pages, so those pages contain only content (I'll call them "article webpages" for clarity below).
  3. The layout of all the article webpages is the same but headings, images, colour schemes and text will vary from one article webpage to another.
  4. Lastly, I also have one other over-arching webpage that includes buttons and each button sends the user to an individual article webpage when clicked.

Question

Is there a way to save one webpage as a template for the layout, so that each time the user clicks a button on the over-arching webpage the template is populated with the headings, text, images etc., creating the article webpage on the spot (based either on the URL or on the button clicked)?

I believe I'd need PHP to check the URL side of things, but I'm confused about how to combine that with JS or Ajax to actually do the populating.
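To make the question concrete, here is the rough shape of what I'm imagining, done purely in PHP on the server (all the names here — `article.php`, the `?article` URL parameter, the `$articles` array — are placeholders for illustration, not code I actually have):

```php
<?php
// article.php — one template for every article; which article to show
// is picked from the URL, e.g. article.php?article=intro

// Render one article's content into the shared layout.
// htmlspecialchars() stops stored content from breaking the HTML.
function render_article(array $a): string {
    return '<article><h1>' . htmlspecialchars($a['title']) . '</h1>'
         . '<img src="' . htmlspecialchars($a['image']) . '" alt="">'
         . '<p>' . htmlspecialchars($a['body']) . '</p></article>';
}

// Placeholder content; on the real site this would come from MySQL/files.
$articles = [
    'intro' => ['title' => 'Intro', 'body' => 'Welcome!', 'image' => 'img/intro.jpg'],
];

$id = $_GET['article'] ?? 'intro';
if (isset($articles[$id])) {
    // include 'header.php';   // the existing shared header
    echo render_article($articles[$id]);
    // include 'footer.php';   // the existing shared footer
} else {
    http_response_code(404);
    echo 'Article not found';
}
```

The buttons on the over-arching webpage would then just be plain links like `<a href="article.php?article=intro">...</a>`, so each page arrives fully rendered and no client-side populating is visible.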

Desired Outcome / Constraints

  1. I want it to happen seamlessly, so that the page loads fully rendered the moment a button is clicked on the over-arching webpage, i.e. I don't want the user to watch content being populated after the page opens.
  2. I thought it might make more sense to have a kind of blank template HTML document that's filled in each time, rather than saving every single article webpage. In effect, article webpages would be created on the spot depending on which button the user clicked.
  3. I wasn't sure whether this approach would cause issues when multiple users try to view different articles at the same time.
  4. I don't want the template approach to make the website heavy and slow if it would be faster to just save every article webpage individually.
  5. I wasn't sure whether this approach might make the website less accessible to some users, depending on their devices and browsers. For instance, if a device can't run JavaScript, could they not see the webpages at all? (I don't know if that's even a thing; it's just a theoretical example.)
  6. I'm open to new languages if it really makes sense, but I'd rather not lean on out-of-the-box frameworks like Bootstrap, because I want to learn the specifics.
  7. Some article webpages will have more text or images than others, so the code needs to account for that by adding or removing divs or paragraph tags when the template is populated.
  8. The template would need to be populated with:

     a) data from the MySQL database
     b) images from the website folder
     c) text from .txt documents saved in the website folder
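For item 8, I imagine the three sources could be wrapped in small helpers like these (again only a sketch — the `articles` table, its columns, and the `images/` and `texts/` folder names are assumptions):

```php
<?php
// a) row from the MySQL database (a prepared statement guards against
//    SQL injection when the id comes from the URL)
function load_article(PDO $pdo, int $id): ?array {
    $stmt = $pdo->prepare('SELECT title, image_file, text_file FROM articles WHERE id = ?');
    $stmt->execute([$id]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ?: null;
}

// b) image path resolved against the website's image folder
function image_url(string $file): string {
    return 'images/' . rawurlencode($file);
}

// c) body text read from a .txt file, split on blank lines so articles
//    with more or fewer paragraphs (constraint 7) get exactly the right
//    number of <p> tags — no fixed slots in the template
function paragraphs(string $text): string {
    $out = '';
    foreach (preg_split('/\R\R+/', trim($text)) as $p) {
        $out .= '<p>' . htmlspecialchars($p) . "</p>\n";
    }
    return $out;
}
```

In the template this would be used roughly as `$row = load_article($pdo, (int)($_GET['id'] ?? 0));` followed by `echo paragraphs(file_get_contents('texts/' . $row['text_file']));`, so the same file serves every article.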

All help greatly appreciated and Happy New Year's Eve!! :)



source https://stackoverflow.com/questions/70544768/best-way-to-minimise-streamline-use-synergies-of-code-to-populate-several-webpag
