(Devs: if you do not want me scraping, please delete this. I have no ill intent; it scrapes one user's downloads once a day and will not lag your site.)

YOU WILL NEED TO GO IN AND REPLACE EVERY "stocktonaerospace" with your exact username!!

HOW TO RUN:
install python

open cmd prompt, run: "pip install requests beautifulsoup4"

copy script into notepad, save as junodownloadsweb.py

open cmd prompt, type "python junodownloadsweb.py" to generate index.html

use github or other site, create a repository named "stocktonaerospace-downloads", click create repository

in repository, add file --> upload file --> index.html

enable github pages: settings --> pages --> Deploy from branch --> main --> root --> save

your website URL should be "https://(github username).github.io/stocktonaerospace-downloads/"
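The URL pattern from the step above can be sanity-checked with a tiny sketch; `github_username` below is a made-up placeholder, not a real account:

```python
# Hypothetical check of the GitHub Pages URL pattern described above.
github_username = "examplestudent"  # placeholder username, not a real account
repo = "stocktonaerospace-downloads"
url = f"https://{github_username}.github.io/{repo}/"
print(url)  # https://examplestudent.github.io/stocktonaerospace-downloads/
```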

insert hyperlink in juno bio

will display total downloads with 1 of 3 fun messages

SCRIPT
SAVE THIS AS junodownloadsweb.py. Copy everything from "import" at the top through "main()" at the very bottom.

import requests
from bs4 import BeautifulSoup
import json
import os
import time
import random
from datetime import datetime

# --- SETTINGS ---

USERNAME = "StocktonAerospace"
BASE_URL = f"https://www.simplerockets.com/u/{USERNAME}"
CACHE_FILE = "downloads_cache.json"
REFRESH_INTERVAL = 86400  # 1 day in seconds
HTML_FILE = "index.html"

def get_user_downloads(username):
    base_url = f"https://www.simplerockets.com/u/{username}"
    page = 1
    total_downloads = 0

    while True:
        url = f"{base_url}?page={page}"
        response = requests.get(url, headers={'User-Agent': 'Mozilla/5.0'})
        if response.status_code != 200:
            break

        soup = BeautifulSoup(response.text, 'html.parser')
        download_spans = soup.select("span.downloads.pull-right")
        if not download_spans:
            break

        for span in download_spans:
            text = span.get_text(strip=True)
            digits = ''.join(filter(str.isdigit, text))
            if digits:
                total_downloads += int(digits)

        page += 1

    return total_downloads

def load_cache():
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, 'r') as f:
            return json.load(f)
    return {}

def save_cache(data):
    with open(CACHE_FILE, 'w') as f:
        json.dump(data, f)

def generate_message(total):
    """Pick a random fun message with the total number inserted."""
    messages = [
        f"{total} people are very satisfied!",
        f"{total} would recommend!",
        f"{total} out of {total} say “super safe, didn’t crash!”"
    ]
    return random.choice(messages)

def generate_html(total_downloads):
    """Create or overwrite index.html with latest data and random message"""
    last_updated = datetime.now().strftime("%B %d, %Y — %I:%M %p")
    message = generate_message(total_downloads)

    html_content = f"""<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{USERNAME} Downloads</title>
<style>
body {{
    font-family: Arial, sans-serif;
    background-color: #0b0b0b;
    color: #ffffff;
    text-align: center;
    margin-top: 10%;
}}
a {{
    color: #00aaff;
    text-decoration: none;
}}
a:hover {{
    text-decoration: underline;
}}
.downloads {{
    font-size: 2.5em;
    margin-bottom: 10px;
}}
.updated {{
    font-size: 1em;
    color: #aaaaaa;
    margin-bottom: 20px;
}}
.message {{
    font-size: 1.2em;
    color: #00ffaa;
    margin-top: 20px;
}}
</style>
</head>
<body>
<div class="downloads">{total_downloads} downloads</div>
<div class="updated">Last updated: {last_updated}</div>
<a href="{BASE_URL}" target="_blank">{BASE_URL}</a>
<div class="message">{message}</div>
</body>
</html>"""
    with open(HTML_FILE, 'w', encoding='utf-8') as f:
        f.write(html_content)

def main():
    cache = load_cache()

    # Check if cache is fresh
    if USERNAME in cache and time.time() - cache[USERNAME]["timestamp"] < REFRESH_INTERVAL:
        total = cache[USERNAME]["downloads"]
    else:
        total = get_user_downloads(USERNAME)
        cache[USERNAME] = {"downloads": total, "timestamp": time.time()}
        save_cache(cache)

    # Print in console
    print(f"{total} downloads — {BASE_URL}")

    # Generate or update the webpage
    generate_html(total)

if __name__ == "__main__":
    main()
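As a quick sanity check, the two bits of logic the script leans on (digit extraction and cache freshness) can be exercised in isolation; `text` and `entry` below are made-up sample data, not output from the real site:

```python
import time

# Hypothetical sample of the text inside a download span (not real site output).
text = "1,234 downloads"
# Same digit-extraction trick the script uses: keep digits, drop everything else.
digits = ''.join(filter(str.isdigit, text))
print(int(digits))  # 1234

# Cache-freshness check as in main(): an entry counts as fresh for 86400 seconds.
REFRESH_INTERVAL = 86400
entry = {"downloads": 42, "timestamp": time.time()}
print(time.time() - entry["timestamp"] < REFRESH_INTERVAL)  # True
```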

Tags
Idea

11 Comments

  • 180 NOTHOSH

    @Cyberstar3964 Are you using a scraper to show your total downloads on your profile?

    1 hour ago
  • Who summoned me

    3 days ago
  • @NOTHOSH no rush at all man. Thanks and have a good day tomorrow

    3 days ago
  • 180 NOTHOSH

    @StocktonAerospace I'm gonna try to update it this week so it works how Cyberstar's does. I got work tomorrow then class on Wednesday, then my kid til Saturday when I work again. I'll try to do this at work if it's slow, I'm actually really enjoying it

    3 days ago
  • @NOTHOSH I tried posting that from gpt and I think I must have messed it up. It’s giving me a headache lol

    3 days ago
  • 180 NOTHOSH

    @StocktonAerospace I just checked @Cyberstar3964
    profile, he does something similar except uses an image so it displays it instead of requiring you to click. I'm going to try to replicate that tomorrow

    3 days ago
  • 180 NOTHOSH

    @StocktonAerospace I really don't, I did not write most of that code. I just found the html for the downloads display, added the automation function, and how it's indexed. Chatgpt did almost everything else lol

    3 days ago
  • 180 NOTHOSH

    Add this line below to automate task

    # Schedule to run once every 24 hours
    schedule.every(24).hours.do(update_downloads)
    
    print("  Auto-refresh enabled. Running every 24 hours...")
    while True:
        schedule.run_pending()
        time.sleep(60)  # check once per minute
    

    immediately above last line

    3 days ago
  • 180 NOTHOSH

    If you copy and paste you will get a syntax error, this is because posting the script here changes the dashes into triple dashes. Go to chat gpt and paste the script and write "Fix logical and syntax errors in python script for scraping, rewrite ready for copy and paste"

    3 days ago
  • @NOTHOSH this is gonna take me a second to digest lol. I didn’t know you knew this so well

    3 days ago
  • 180 NOTHOSH

    @StocktonAerospace

    3 days ago
