Ready to dive into some cool Python scripts? In this list, we’ve rounded up 20 interesting ones that are not only fun but also practical. Whether you’re a coding enthusiast or just looking for useful tools, these scripts have you covered. Let’s check them out and see how Python can make your tasks easier and more enjoyable!
1. Web Scraper with BeautifulSoup and Requests:
Objective: Create a simple web scraper to extract data from a website using Python.
See also: the top Python libraries for web scraping.
Code Snippet:
```python
import requests
from bs4 import BeautifulSoup

url = 'https://example.com'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Extracting data, e.g., all the links on the page
links = soup.find_all('a')
for link in links:
    print(link.get('href'))
```
Explanation:
- `requests.get(url)`: Fetches the HTML content of the specified URL.
- `BeautifulSoup(response.text, 'html.parser')`: Parses the HTML content.
- `soup.find_all('a')`: Finds all anchor tags.
- `link.get('href')`: Retrieves the `href` attribute of each anchor tag.
This script extracts and prints all the links from a webpage.
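Note that many pages use relative URLs (`/about` rather than `https://example.com/about`). A small optional extension using the standard library’s `urljoin` resolves them against the page URL; this sketch reuses the `url` and `soup` variables from the snippet above:

```python
from urllib.parse import urljoin

# Resolve relative links against the page URL and skip empty hrefs
for link in soup.find_all('a'):
    href = link.get('href')
    if href:
        print(urljoin(url, href))
```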
2. Machine Learning Model Deployment with Flask:
Objective: Build a Flask web application to deploy a machine learning model and make predictions.
Code Snippet:
```python
from flask import Flask, request, jsonify
import pickle

app = Flask(__name__)

# Load the trained machine learning model
with open('model.pkl', 'rb') as model_file:
    model = pickle.load(model_file)

@app.route('/predict', methods=['POST'])
def predict():
    data = request.get_json(force=True)
    prediction = model.predict([data['input']])
    output = {'prediction': prediction[0]}
    return jsonify(output)

if __name__ == '__main__':
    app.run(port=5000)
```
Explanation:
- `Flask`: A web framework for Python.
- `pickle`: Used to serialize and deserialize the machine learning model.
- `@app.route('/predict', methods=['POST'])`: Defines an endpoint for making predictions.
- `request.get_json(force=True)`: Retrieves input data in JSON format.
- `model.predict([data['input']])`: Uses the trained model to make predictions.
This script sets up a Flask web server that loads a pre-trained machine learning model and exposes an endpoint for making predictions.
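To try the endpoint, you can send a JSON payload from a separate script. Here is a minimal client sketch; it assumes the server above is running locally on port 5000, and the feature vector is purely hypothetical (its shape must match whatever `model.pkl` was trained on):

```python
import requests

# Hypothetical feature vector; adjust to match your model's expected input
payload = {'input': [5.1, 3.5, 1.4, 0.2]}
response = requests.post('http://localhost:5000/predict', json=payload)
print(response.json())  # e.g., {'prediction': ...}
```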
3. Blockchain Implementation in Python:
Objective: Create a basic blockchain system to understand the fundamental concepts of blockchain technology.
Code Snippet:
```python
import hashlib
import time

class Block:
    def __init__(self, previous_hash, data):
        self.timestamp = time.time()
        self.previous_hash = previous_hash
        self.data = data
        self.hash = self.calculate_hash()

    def calculate_hash(self):
        return hashlib.sha256(
            (str(self.timestamp) + str(self.previous_hash) + str(self.data)).encode('utf-8')
        ).hexdigest()

# Create a blockchain
blockchain = [Block("0", "Genesis Block")]

# Add a new block
new_data = "Some transaction data"
previous_block = blockchain[-1]
new_block = Block(previous_block.hash, new_data)
blockchain.append(new_block)
```
Explanation:
- `hashlib.sha256()`: Generates a SHA-256 hash.
- `Block`: Represents a block in the blockchain with a timestamp, data, previous hash, and current hash.
- `calculate_hash()`: Computes the hash of the block using its attributes.
This script initializes a blockchain with a genesis block and adds a new block with some transaction data.
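The point of chaining hashes is that tampering becomes detectable: if any block’s data changes, recomputing its hash exposes the change, and every later block’s `previous_hash` link breaks. A minimal validation sketch for the `Block` class above:

```python
def is_chain_valid(chain):
    """Check that each block's stored hash matches its contents
    and that each block links to the previous block's hash."""
    for i in range(1, len(chain)):
        current, previous = chain[i], chain[i - 1]
        if current.hash != current.calculate_hash():
            return False  # block contents were altered
        if current.previous_hash != previous.hash:
            return False  # chain link is broken
    return True

print(is_chain_valid(blockchain))  # True for the untampered chain
```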
4. Automated Testing with Selenium:
Objective: Write Python scripts using Selenium to automate browser testing for web applications.
Code Snippet:
```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Initialize a Chrome browser
driver = webdriver.Chrome()

# Open a website
driver.get('https://example.com')

# Find an element by its ID and interact with it
# (find_element_by_id was removed in Selenium 4; use By.ID instead)
element = driver.find_element(By.ID, 'some_element_id')
element.click()

# Perform assertions or other actions as needed

# Close the browser
driver.quit()
```
Explanation:
- `webdriver.Chrome()`: Initializes a Chrome browser.
- `driver.get('https://example.com')`: Opens a website.
- `driver.find_element(By.ID, 'some_element_id')`: Finds an element by its ID.
- `element.click()`: Interacts with the element (e.g., clicks a button).
This script automates browser interactions for testing web applications.
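Real pages often load elements asynchronously, so an immediate `find_element` call can fail. Selenium’s explicit waits handle this; a sketch that reuses the `driver` from above and the same hypothetical element ID:

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait up to 10 seconds for the element to become clickable,
# instead of failing immediately while the page is still loading
element = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.ID, 'some_element_id'))
)
element.click()
```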
5. Network Scanner with Scapy:
Objective: Develop a network scanner using the Scapy library to discover devices on a local network.
Code Snippet:
```python
from scapy.all import ARP, Ether, srp

# Note: sending raw ARP packets typically requires root/administrator privileges
def scan(ip):
    arp_request = ARP(pdst=ip)
    ether = Ether(dst="ff:ff:ff:ff:ff:ff")
    packet = ether / arp_request
    result = srp(packet, timeout=3, verbose=0)[0]

    devices = []
    for sent, received in result:
        devices.append({'ip': received.psrc, 'mac': received.hwsrc})
    return devices

ip_range = "192.168.1.1/24"
devices = scan(ip_range)
for device in devices:
    print(f"IP: {device['ip']}, MAC: {device['mac']}")
```
Explanation:
- `scapy.all`: A powerful library for packet manipulation.
- `ARP`: Represents an ARP request packet.
- `Ether`: Represents an Ethernet frame.
- `srp`: Sends and receives packets at the data link layer.
This script performs an ARP scan on a specified IP range to discover devices on the local network.
6. Cryptocurrency Price Tracker with API:
Objective: Build a script that fetches real-time cryptocurrency prices using a cryptocurrency API.
Code Snippet:
```python
import requests

def get_crypto_prices():
    url = 'https://api.coingecko.com/api/v3/simple/price'
    params = {'ids': 'bitcoin,ethereum,litecoin', 'vs_currencies': 'usd'}
    response = requests.get(url, params=params)
    prices = response.json()
    return prices

crypto_prices = get_crypto_prices()
for coin, value in crypto_prices.items():
    print(f"{coin.capitalize()}: ${value['usd']}")
```
Explanation:
- `requests.get(url, params=params)`: Sends an HTTP GET request to the CoinGecko API with the specified parameters.
- `response.json()`: Parses the JSON response.
- Prints the current prices of Bitcoin, Ethereum, and Litecoin in USD.
This script fetches and displays real-time cryptocurrency prices.
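To turn the one-shot fetch into an actual tracker, you can poll the API on an interval. A simple sketch reusing `get_crypto_prices()` from above (the 60-second interval is an arbitrary choice; check the API’s rate limits before lowering it):

```python
import time
import requests

while True:
    try:
        prices = get_crypto_prices()
        for coin, value in prices.items():
            print(f"{coin.capitalize()}: ${value['usd']}")
    except requests.RequestException as err:
        print(f"Request failed: {err}")
    time.sleep(60)  # poll once per minute
```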
7. Data Analysis with Pandas and Matplotlib:
Objective: Analyze and visualize a dataset using Pandas for data manipulation and Matplotlib for plotting.
Code Snippet:
```python
import pandas as pd
import matplotlib.pyplot as plt

# Assume 'data.csv' is your dataset
df = pd.read_csv('data.csv')

# Perform data analysis
average_age = df['Age'].mean()
total_records = len(df)

# Create a bar chart
plt.bar(['Average Age', 'Total Records'], [average_age, total_records])
plt.xlabel('Metrics')
plt.ylabel('Values')
plt.title('Basic Data Analysis')
plt.show()
```
Explanation:
- `pd.read_csv('data.csv')`: Reads a CSV file into a Pandas DataFrame.
- `df['Age'].mean()`: Calculates the average age from the ‘Age’ column.
- `len(df)`: Gets the total number of records in the dataset.
- `plt.bar(...)`: Creates a bar chart using Matplotlib.
This script performs basic data analysis and generates a bar chart.
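Beyond single-column summaries, Pandas’ `groupby` pairs naturally with Matplotlib. A sketch continuing from the snippet above, assuming the hypothetical `data.csv` also contains a `Department` column:

```python
# Average age per department, plotted directly from the grouped Series
df.groupby('Department')['Age'].mean().plot(kind='bar')
plt.ylabel('Average Age')
plt.title('Average Age by Department')
plt.tight_layout()
plt.show()
```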
8. Natural Language Processing (NLP) with NLTK:
Objective: Perform basic NLP tasks like tokenization and sentiment analysis using the Natural Language Toolkit (NLTK).
Code Snippet:
```python
import nltk
from nltk.tokenize import word_tokenize, sent_tokenize
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download('punkt')  # newer NLTK releases may also need 'punkt_tab'
nltk.download('vader_lexicon')

text = "NLTK is a powerful library for natural language processing. I love using it!"

# Tokenization
words = word_tokenize(text)
sentences = sent_tokenize(text)

# Sentiment analysis
sid = SentimentIntensityAnalyzer()
sentiment_score = sid.polarity_scores(text)

print("Tokenized Words:", words)
print("Tokenized Sentences:", sentences)
print("Sentiment Score:", sentiment_score)
```
Explanation:
- `word_tokenize` and `sent_tokenize`: Tokenize the text into words and sentences.
- `SentimentIntensityAnalyzer`: NLTK’s tool for sentiment analysis.
This script demonstrates basic text tokenization and sentiment analysis using NLTK.
9. Python Script for Data Encryption/Decryption:
Objective: Implement a script that encrypts and decrypts data using cryptographic algorithms.
Code Snippet:
```python
from cryptography.fernet import Fernet

# Generate a key for encryption and decryption
key = Fernet.generate_key()
cipher_suite = Fernet(key)

# Data to be encrypted
data = b"Confidential information"

# Encryption
cipher_text = cipher_suite.encrypt(data)
print("Encrypted:", cipher_text)

# Decryption
plain_text = cipher_suite.decrypt(cipher_text)
print("Decrypted:", plain_text.decode('utf-8'))
```
Explanation:
- `Fernet`: A symmetric-key encryption scheme.
- `Fernet.generate_key()`: Generates a key for encryption and decryption.
- `cipher_suite.encrypt(data)`: Encrypts the data.
- `cipher_suite.decrypt(cipher_text)`: Decrypts the data.
This script demonstrates basic data encryption and decryption using the Fernet library.
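One practical caveat: a key generated at runtime is lost when the process exits, which makes earlier ciphertexts undecryptable. A common pattern is persisting the key to a file, sketched below with a hypothetical `secret.key` path (in production you would use a proper secrets store and restrict file permissions):

```python
import os
from cryptography.fernet import Fernet

KEY_FILE = 'secret.key'  # hypothetical path; protect this file

def load_or_create_key():
    # Reuse the existing key if present, otherwise generate and save one
    if os.path.exists(KEY_FILE):
        with open(KEY_FILE, 'rb') as f:
            return f.read()
    key = Fernet.generate_key()
    with open(KEY_FILE, 'wb') as f:
        f.write(key)
    return key

cipher_suite = Fernet(load_or_create_key())
```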
10. Creating a RESTful API with Flask:
Objective: Build a RESTful API using Flask for serving and consuming data.
Code Snippet:
```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Sample data
books = [
    {'id': 1, 'title': 'Python Crash Course', 'author': 'Eric Matthes'},
    {'id': 2, 'title': 'Fluent Python', 'author': 'Luciano Ramalho'}
]

# GET request to retrieve all books
@app.route('/books', methods=['GET'])
def get_books():
    return jsonify({'books': books})

# POST request to add a new book
@app.route('/books', methods=['POST'])
def add_book():
    new_book = request.get_json()
    books.append(new_book)
    return jsonify({'message': 'Book added successfully'})

if __name__ == '__main__':
    app.run(port=5000)
```
Explanation:
- `@app.route('/books', methods=['GET'])`: Defines an endpoint for retrieving all books.
- `@app.route('/books', methods=['POST'])`: Defines an endpoint for adding a new book.
- `request.get_json()`: Retrieves JSON data from the request.
This script sets up a simple RESTful API for managing a list of books.
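With the server running, the API can be exercised from another script. A minimal client sketch, assuming the default `localhost:5000`:

```python
import requests

BASE = 'http://localhost:5000'

# Add a book via the POST endpoint
new_book = {'id': 3, 'title': 'Effective Python', 'author': 'Brett Slatkin'}
print(requests.post(f'{BASE}/books', json=new_book).json())

# Retrieve the full list via the GET endpoint
print(requests.get(f'{BASE}/books').json())
```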
11. Image Processing with OpenCV:
Objective: Develop a script for basic image processing tasks such as edge detection or color filtering using OpenCV.
Code Snippet:
```python
import cv2

# Load an image
image = cv2.imread('image.jpg')

# Convert the image to grayscale
gray_image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Perform edge detection
edges = cv2.Canny(gray_image, 50, 150)

# Display the original and processed images
cv2.imshow('Original Image', image)
cv2.imshow('Edge Detection', edges)
cv2.waitKey(0)
cv2.destroyAllWindows()
```
Explanation:
- `cv2.imread('image.jpg')`: Reads an image from a file.
- `cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)`: Converts the image to grayscale.
- `cv2.Canny(gray_image, 50, 150)`: Applies edge detection using the Canny algorithm.
- `cv2.imshow(...)`: Displays images.
This script demonstrates basic image processing using OpenCV.
12. Stock Price Prediction with Machine Learning:
Objective: Train a machine learning model to predict stock prices using historical data.
Code Snippet:
```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
import matplotlib.pyplot as plt

# Assume 'stock_data.csv' contains historical stock prices
df = pd.read_csv('stock_data.csv')

# Feature engineering: encode dates as numbers so the model can use them
df['Date'] = pd.to_datetime(df['Date'])
df['Date'] = df['Date'].dt.strftime('%Y%m%d').astype(float)

# Selecting features and target
X = df[['Date']]
y = df['Close']

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a linear regression model
model = LinearRegression()
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

# Plot the predictions against actual values
plt.scatter(X_test, y_test, color='black')
plt.plot(X_test, predictions, color='blue', linewidth=3)
plt.xlabel('Date')
plt.ylabel('Stock Price (Close)')
plt.title('Stock Price Prediction')
plt.show()
```
Explanation:
- `pd.read_csv('stock_data.csv')`: Reads historical stock price data from a CSV file.
- `train_test_split`: Splits the data into training and testing sets.
- `LinearRegression()`: Initializes a linear regression model.
- `model.fit(X_train, y_train)`: Trains the model on the training data.
- `model.predict(X_test)`: Makes predictions on the test data.
This script illustrates a basic stock price prediction using linear regression.
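Predicting prices from the date alone is a toy exercise, so it is worth quantifying how well (or poorly) the model fits. scikit-learn’s standard regression metrics drop in directly after the prediction step:

```python
from sklearn.metrics import mean_squared_error, r2_score

# Compare predictions against the held-out test set
mse = mean_squared_error(y_test, predictions)
r2 = r2_score(y_test, predictions)
print(f"Mean squared error: {mse:.2f}")
print(f"R^2 score: {r2:.3f}")  # closer to 1.0 means a better fit
```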
13. Automated Email Sender:
Objective: Write a script to send automated emails using the `smtplib` library.
Code Snippet:
```python
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

# Sender and recipient email addresses (placeholders; replace with real ones)
sender_email = '[email protected]'
recipient_email = '[email protected]'

# Email server configuration
smtp_server = 'smtp.gmail.com'
smtp_port = 587
smtp_username = '[email protected]'
smtp_password = 'your_email_password'  # for Gmail, use an app password

# Message content
subject = 'Automated Email'
body = 'This is an automated email sent using Python.'

# Create the MIME object
message = MIMEMultipart()
message['From'] = sender_email
message['To'] = recipient_email
message['Subject'] = subject
message.attach(MIMEText(body, 'plain'))

# Connect to the SMTP server and send the email
with smtplib.SMTP(smtp_server, smtp_port) as server:
    server.starttls()
    server.login(smtp_username, smtp_password)
    server.sendmail(sender_email, recipient_email, message.as_string())
```
Explanation:
- `smtplib.SMTP`: Establishes a connection to the SMTP server.
- `MIMEText` and `MIMEMultipart`: Used for composing the email message.
This script demonstrates how to send automated emails using Python.
14. Chatbot Implementation with ChatterBot:
Objective: Create a simple chatbot using the ChatterBot library for natural language conversation.
Code Snippet:
```python
from chatterbot import ChatBot
from chatterbot.trainers import ChatterBotCorpusTrainer

# Create a chatbot instance
chatbot = ChatBot('MyBot')

# Create a new trainer for the chatbot
trainer = ChatterBotCorpusTrainer(chatbot)

# Train the chatbot on English language data
trainer.train('chatterbot.corpus.english')

# Get a response from the chatbot
response = chatbot.get_response('Hello, how are you?')
print(response)
```
Explanation:
- `ChatBot`: Represents the chatbot.
- `ChatterBotCorpusTrainer`: Trains the chatbot using corpus data.
- `trainer.train('chatterbot.corpus.english')`: Trains the chatbot on English language data.
This script sets up a chatbot and demonstrates a conversation with it.
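To hold an actual conversation rather than a single exchange, wrap `get_response` in a loop. A simple REPL sketch (type "quit" to exit); note that ChatterBot has seen little maintenance in recent years and may only install cleanly on older Python versions:

```python
while True:
    user_input = input('You: ')
    if user_input.lower() == 'quit':
        break
    print('Bot:', chatbot.get_response(user_input))
```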
15. Automated PDF Generator with ReportLab:
Objective: Build a script to generate PDF reports dynamically using the ReportLab library.
Code Snippet:
```python
from reportlab.pdfgen import canvas

def generate_pdf(file_name, content):
    # Create a PDF document
    pdf = canvas.Canvas(file_name)

    # Add content to the PDF
    pdf.drawString(100, 750, "Generated PDF Report")
    pdf.drawString(100, 730, "----------------------")
    pdf.drawString(100, 700, content)

    # Save the PDF
    pdf.save()

# Example usage
report_content = "This is a sample PDF report."
generate_pdf('sample_report.pdf', report_content)
```
Explanation:
- `canvas.Canvas`: Creates a PDF canvas for drawing.
- `pdf.drawString(x, y, text)`: Adds text to the PDF at specified coordinates.
This script generates a simple PDF report using the ReportLab library.
16. Web Scraping with Scrapy:
Objective: Develop a web scraper using the Scrapy framework to extract structured data from websites.
Code Snippet:
```python
import scrapy

class QuotesSpider(scrapy.Spider):
    name = 'quotes'
    start_urls = ['http://quotes.toscrape.com/page/1/']

    def parse(self, response):
        for quote in response.css('div.quote'):
            yield {
                'text': quote.css('span.text::text').get(),
                'author': quote.css('small::text').get(),
                'tags': quote.css('div.tags a.tag::text').getall(),
            }

        next_page = response.css('li.next a::attr(href)').get()
        if next_page is not None:
            yield response.follow(next_page, self.parse)
```
Explanation:
- `scrapy.Spider`: Defines a spider for web scraping.
- `response.css(...)`: Uses CSS selectors to extract data from HTML.
- `yield {...}`: Yields scraped data.
- `response.follow(next_page, self.parse)`: Follows links to subsequent pages.
This script represents a basic Scrapy spider for extracting quotes from a website.
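Spiders usually live inside a Scrapy project and run via `scrapy crawl quotes`, but a single-file spider like this can also run standalone: either with `scrapy runspider quotes_spider.py -o quotes.json` (assuming that file name) or programmatically, as in this sketch:

```python
from scrapy.crawler import CrawlerProcess

# Run the spider in-process and write the scraped items to a JSON file
process = CrawlerProcess(settings={'FEEDS': {'quotes.json': {'format': 'json'}}})
process.crawl(QuotesSpider)
process.start()  # blocks until the crawl finishes
```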
17. Twitter Bot with Tweepy:
Objective: Create a simple Twitter bot using the Tweepy library to post tweets.
Code Snippet:
```python
import tweepy
import time

# Twitter API credentials
consumer_key = 'your_consumer_key'
consumer_secret = 'your_consumer_secret'
access_token = 'your_access_token'
access_token_secret = 'your_access_token_secret'

# Authenticate with Twitter
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
api = tweepy.API(auth)

# Tweet content
tweet_text = "Hello, Twitter! This is my first tweet using Tweepy."

# Post the tweet
api.update_status(tweet_text)

# Wait for 5 seconds (optional)
time.sleep(5)
```
Explanation:
- `tweepy.OAuthHandler`: Authenticates with the Twitter API.
- `api.update_status(tweet_text)`: Posts a tweet.
This script demonstrates how to create a simple Twitter bot using Tweepy.
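Note that `api.update_status` goes through Twitter’s v1.1 API, which many developer accounts can no longer post to. With Tweepy 4.x, the v2 equivalent is `tweepy.Client.create_tweet`; a sketch using the same placeholder credentials:

```python
import tweepy

# Authenticate against the v2 API (placeholder credentials)
client = tweepy.Client(
    consumer_key='your_consumer_key',
    consumer_secret='your_consumer_secret',
    access_token='your_access_token',
    access_token_secret='your_access_token_secret',
)

# Post a tweet via the v2 endpoint
client.create_tweet(text="Hello, Twitter! This is my first tweet using Tweepy.")
```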
18. Data Visualization with Plotly:
Objective: Create interactive data visualizations using the Plotly library.
Code Snippet:
```python
import plotly.express as px
import plotly.graph_objects as go

# Sample data
data = {'Category': ['A', 'B', 'C', 'D'], 'Values': [10, 25, 15, 30]}

# Create a bar chart using Plotly Express
fig1 = px.bar(data, x='Category', y='Values', title='Bar Chart')

# Create a scatter plot using Plotly Graph Objects
fig2 = go.Figure()
fig2.add_trace(go.Scatter(x=data['Category'], y=data['Values'],
                          mode='markers', name='Scatter Plot'))

# Show the visualizations
fig1.show()
fig2.show()
```
Explanation:
- `plotly.express`: Provides a high-level interface for creating visualizations.
- `plotly.graph_objects`: Allows more customization of visualizations.
This script showcases creating a bar chart with Plotly Express and a scatter plot with Plotly Graph Objects.
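Because Plotly figures are interactive HTML documents under the hood, they can also be saved and shared without a running Python session:

```python
# Save each figure as a standalone interactive HTML file
fig1.write_html('bar_chart.html')
fig2.write_html('scatter_plot.html')
```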
19. GUI Application with Tkinter:
Objective: Develop a simple GUI application using the Tkinter library.
Code Snippet:
```python
import tkinter as tk

def on_button_click():
    label.config(text="Hello, " + entry.get())

# Create the main window
window = tk.Tk()
window.title("Simple Tkinter App")

# Create a label
label = tk.Label(window, text="Enter your name:")
label.pack()

# Create an entry widget
entry = tk.Entry(window)
entry.pack()

# Create a button
button = tk.Button(window, text="Click me", command=on_button_click)
button.pack()

# Run the application
window.mainloop()
```
Explanation:
- `tkinter`: The standard GUI toolkit for Python.
- `tk.Label`, `tk.Entry`, `tk.Button`: Widgets for creating labels, entry fields, and buttons.
- `window.mainloop()`: Starts the event loop for the GUI.
This script demonstrates a simple Tkinter application with an entry field, button, and label.
20. Flask Web Application with Database Integration:
Objective: Build a web application using Flask and integrate a SQLite database.
Code Snippet:
```python
from flask import Flask, render_template, request, redirect, url_for
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///mydatabase.db'
db = SQLAlchemy(app)

# Define a data model
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True, nullable=False)

# Create the database tables (Flask-SQLAlchemy 3.x requires an app context)
with app.app_context():
    db.create_all()

@app.route('/')
def index():
    users = User.query.all()
    return render_template('index.html', users=users)

@app.route('/add_user', methods=['POST'])
def add_user():
    username = request.form['username']
    new_user = User(username=username)
    db.session.add(new_user)
    db.session.commit()
    return redirect(url_for('index'))

if __name__ == '__main__':
    app.run(debug=True)
```
Explanation:
- `Flask`: Web framework for Python.
- `Flask-SQLAlchemy`: Extension for integrating SQLAlchemy with Flask.
- `User(db.Model)`: Data model representing a user in the database.
- `db.create_all()`: Creates the database tables (run inside an application context).
This script sets up a simple Flask web application with SQLite database integration.
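Note that `render_template('index.html', users=users)` expects an `index.html` file in a `templates/` folder next to the script: at minimum, a loop over `users` and a form that POSTs a `username` field to `/add_user`. The routes can also be smoke-tested without a browser or template using Flask’s built-in test client; a minimal sketch:

```python
# Quick smoke test of the POST route using Flask's test client
with app.test_client() as client:
    response = client.post('/add_user', data={'username': 'alice'})
    print(response.status_code)  # expect 302 (redirect back to the index page)

# Confirm the user was written to the database
with app.app_context():
    print([user.username for user in User.query.all()])
```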
Wrapping Up
In this Python journey, you’ve explored a variety of scripts covering diverse applications—from web scraping to machine learning, data analysis, automation, and web development. Whether you’re crafting chatbots, predicting stock prices, or creating web applications, these scripts showcase the versatility and power of Python. Now equipped with a diverse set of tools, you’re ready to tackle an array of real-world challenges. Happy coding!