
Learn Python Libraries: NumPy, Pandas, Matplotlib, Requests, and Web Scraping


Explore essential Python libraries for data manipulation, visualization, web scraping, and API integration. Learn NumPy, Pandas, Matplotlib, Requests, BeautifulSoup, and Scrapy with examples.

Objective:

Learn popular Python libraries to perform practical programming tasks such as data manipulation, visualization, API requests, and web scraping.

Topics and Examples:

1. NumPy: Arrays and Vectorized Operations

NumPy is a library for numerical computing in Python. It provides support for multidimensional arrays and vectorized operations.

Example:


import numpy as np

# Create NumPy array
arr = np.array([1, 2, 3, 4])
print(arr) # [1 2 3 4]

# Vectorized operations
print(arr + 10) # [11 12 13 14]
print(arr * 2) # [2 4 6 8]

# 2D array
matrix = np.array([[1, 2], [3, 4]])
print(matrix)
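Vectorization also covers reductions and whole 2D arrays. The short follow-on sketch below reuses the same arrays; the printed values simply follow from the inputs above.


import numpy as np

arr = np.array([1, 2, 3, 4])
matrix = np.array([[1, 2], [3, 4]])

# Reductions are vectorized too
print(arr.sum())    # 10
print(arr.mean())   # 2.5

# Element-wise operations also work on 2D arrays
print(matrix * 10)      # each element multiplied by 10
print(matrix.shape)     # (2, 2)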

2. Pandas: DataFrames, Series, CSV/Excel Operations

Pandas is a library for data manipulation and analysis, built around two core structures: the one-dimensional Series and the two-dimensional, tabular DataFrame.

Example:


import pandas as pd

# Create a DataFrame
data = {'Name': ['Chinmaya', 'Rout'], 'Age': [25, 26]}
df = pd.DataFrame(data)
print(df)

# Read CSV
# df = pd.read_csv("data.csv")

# Write CSV
# df.to_csv("output.csv", index=False)

# Access columns
print(df['Name'])

# Series
ages = pd.Series([25, 26, 27])
print(ages)
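The section heading also mentions Excel operations. A minimal sketch follows; the filename is a placeholder, and reading or writing .xlsx files additionally requires an Excel engine such as openpyxl to be installed.


import pandas as pd

df = pd.DataFrame({'Name': ['Chinmaya', 'Rout'], 'Age': [25, 26]})

# Write to an Excel file (placeholder filename; needs openpyxl installed)
df.to_excel("output.xlsx", index=False)

# Read it back; sheet_name selects which sheet to load (0 = first sheet)
df2 = pd.read_excel("output.xlsx", sheet_name=0)
print(df2)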

3. Matplotlib / Seaborn: Data Visualization

  1. Matplotlib: Basic plotting library
  2. Seaborn: Statistical plotting, built on Matplotlib

Example:


import matplotlib.pyplot as plt
import seaborn as sns

# Matplotlib plot
x = [1, 2, 3, 4]
y = [10, 20, 25, 30]
plt.plot(x, y)
plt.title("Line Plot")
plt.xlabel("X-axis")
plt.ylabel("Y-axis")
plt.show()

# Seaborn plot
data = sns.load_dataset("tips")
sns.barplot(x="day", y="total_bill", data=data)
plt.show()

4. Requests: HTTP Requests, APIs

The Requests library allows sending HTTP requests to interact with web services and APIs.

Example:


import requests

response = requests.get("https://api.github.com")
print(response.status_code) # 200
print(response.json()) # JSON response
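Many APIs also take query parameters. Requests can encode a dictionary into the query string for you; the GitHub search endpoint below is used purely as an illustration, and the timeout value is an arbitrary choice.


import requests

# Query parameters are passed as a dict and encoded into the URL
params = {"q": "python", "per_page": 3}
response = requests.get("https://api.github.com/search/repositories",
                        params=params, timeout=10)

print(response.url)          # final URL including the encoded query string
print(response.status_code)  # 200 on success
data = response.json()       # parsed JSON body as a dict
print(data.get("total_count"))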

5. BeautifulSoup / Scrapy: Web Scraping

  1. BeautifulSoup: Parse HTML and extract data
  2. Scrapy: Advanced web scraping framework

BeautifulSoup Example:


from bs4 import BeautifulSoup
import requests

url = "https://www.example.com"
response = requests.get(url)
soup = BeautifulSoup(response.text, "html.parser")

# Extract all links
for link in soup.find_all("a"):
    print(link.get("href"))

Scrapy (basic commands to create a project and run a spider):


scrapy startproject myproject
scrapy crawl myspider
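For context, a minimal spider that the crawl command above could run might look like the sketch below; the file path, spider name, and target URL are assumptions for illustration.


# myproject/spiders/my_spider.py (hypothetical location)
import scrapy

class MySpider(scrapy.Spider):
    name = "myspider"  # the name used with "scrapy crawl myspider"
    start_urls = ["https://www.example.com"]

    def parse(self, response):
        # Yield the href of every link on the page as a scraped item
        for href in response.css("a::attr(href)").getall():
            yield {"link": href}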

This section covers essential Python libraries for intermediate learners, including data manipulation with NumPy and Pandas, visualization with Matplotlib/Seaborn, API interaction with Requests, and web scraping with BeautifulSoup/Scrapy.