Overview
The SEO & Accessibility Checker is a free tool that provides comprehensive website analysis, including SEO performance, accessibility compliance, and actionable recommendations. The tool scans either a single page or every page listed in a site's sitemap, identifies issues, and provides detailed reports with scores and fixes.
Web Interface:
https://330hosting.com/seo-checker/
Features
- Comprehensive Analysis: Scans multiple pages from the sitemap, or analyzes a single page
- SEO Scoring: Provides SEO score (0-100) with detailed breakdown
- Accessibility Scoring: WCAG compliance checking with accessibility score (0-100)
- Performance Scoring: Page speed and optimization analysis
- Issue Detection: Identifies critical issues and warnings
- Technical SEO: Checks sitemap.xml and robots.txt
- Search Preview: Shows how your site appears in Google search results
- Actionable Fixes: Provides specific recommendations and code fixes
Using the Web Interface
Step 1: Access the Tool
Navigate to https://330hosting.com/seo-checker/
Step 2: Enter Website URL
In the "Check Your Website" section, enter your website URL. You can enter:
- Full URL: https://example.com
- Domain only: example.com (https:// will be added automatically)
- With path: https://example.com/about
Step 3: Analyze
- Click the "Check SEO" button
- Wait for the analysis to complete (may take 1-5 minutes depending on site size)
- Review the results in the following sections:
- Overall SEO Score: Circular progress indicator with overall score
- Score Breakdown: Individual scores for SEO, Accessibility, and Performance
- Issues Summary: Count of critical issues and warnings
- Technical SEO Status: Sitemap and robots.txt information
- Search Results Preview: How your site appears in Google
- Tabs: Overview, Pages, and All Issues
Step 4: Review Results
- Overview Tab: Summary of scores and key issues
- Pages Tab: Detailed analysis of each page scanned
- All Issues Tab: Complete list of all issues found across all pages
API Documentation
API Endpoint
Endpoint: https://330hosting.com/api/seo-check
Method: GET
Parameters:
- url (required): The website URL to analyze
- format (optional): Response format - json (recommended)
- scanSitemap (optional): Set to true to scan all pages from sitemap.xml
- scanNestedSitemaps (optional): Set to true to also scan nested sitemap index files
- maxPages (optional): Maximum number of pages to scan when using sitemap (default: 500)
GET https://330hosting.com/api/seo-check?url=https://example.com&format=json&scanSitemap=true
Note: All analysis is performed locally using advanced algorithms. No external dependencies or third-party APIs are used.
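The optional parameters can be combined. For example, to scan up to 50 pages from the sitemap, including nested sitemap index files:
GET https://330hosting.com/api/seo-check?url=https://example.com&format=json&scanSitemap=true&scanNestedSitemaps=true&maxPages=50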
Request Examples
cURL
# Single page scan
curl "https://330hosting.com/api/seo-check?url=https://example.com&format=json"
# Scan all pages from sitemap
curl "https://330hosting.com/api/seo-check?url=https://example.com&format=json&scanSitemap=true&maxPages=50"JavaScript (Fetch API)
async function checkSEO(url, scanSitemap = false) {
try {
const params = new URLSearchParams({
url: url,
format: 'json'
});
if (scanSitemap) {
params.append('scanSitemap', 'true');
}
const response = await fetch(
`https://330hosting.com/api/seo-check?${params.toString()}`
);
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
const data = await response.json();
return data;
} catch (error) {
console.error('Error checking SEO:', error);
return null;
}
}
// Usage - single page
const results = await checkSEO('https://example.com');
if (results) console.log('SEO Score:', results.summary?.seoScore);
// Usage - full site scan
const siteResults = await checkSEO('https://example.com', true);
if (siteResults) console.log('Pages scanned:', siteResults.pagesScanned);
Python
import requests
def check_seo(url, scan_sitemap=False, max_pages=500):
api_url = "https://330hosting.com/api/seo-check"
params = {
"url": url,
"format": "json"
}
if scan_sitemap:
params["scanSitemap"] = "true"
params["maxPages"] = max_pages
try:
response = requests.get(api_url, params=params, timeout=300)
response.raise_for_status()
return response.json()
except requests.exceptions.RequestException as e:
print(f"Error checking SEO: {e}")
return None
# Usage - single page
results = check_seo("https://example.com")
if results and results.get("success"):
print(f"SEO Score: {results['summary']['seoScore']}")
print(f"Accessibility Score: {results['summary']['accessibilityScore']}")
print(f"Performance Score: {results['summary']['performanceScore']}")
# Usage - full site scan
site_results = check_seo("https://example.com", scan_sitemap=True)
if site_results:
print(f"Pages scanned: {site_results.get('pagesScanned', 0)}")Response Format
Success Response
{
"success": true,
"summary": {
"seoScore": 89,
"accessibilityScore": 92,
"performanceScore": 85,
"criticalIssues": 3,
"warnings": 12
},
"pagesScanned": 15,
"sitemapInfo": {
"found": true,
"url": "https://example.com/sitemap.xml",
"pageCount": 15,
"valid": true
},
"robotsInfo": {
"found": true,
"url": "https://example.com/robots.txt",
"valid": true,
"hasSitemap": true,
"allowsAll": true
},
"pages": [...],
"seo": {...},
"accessibility": {...}
}
Error Response
{
"success": false,
"error": "Unable to fetch the website. Please check that the URL is correct and accessible."
}
Response Fields
Summary Object
- seoScore (number): SEO score from 0-100
- accessibilityScore (number): Accessibility score from 0-100
- performanceScore (number): Performance score from 0-100
- criticalIssues (number): Count of critical issues found
- warnings (number): Count of warnings found
SitemapInfo Object
- found (boolean): Whether sitemap.xml was found
- url (string): URL of the sitemap
- pageCount (number): Number of URLs in the sitemap
- valid (boolean): Whether the sitemap is valid XML
RobotsInfo Object
- found (boolean): Whether robots.txt was found
- url (string): URL of robots.txt
- valid (boolean): Whether robots.txt is valid
- hasSitemap (boolean): Whether robots.txt references a sitemap
- allowsAll (boolean): Whether robots.txt allows all crawlers
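As a minimal sketch of reporting on these two objects (both appear at the top level of a successful response, as shown in the Success Response example above):
def report_technical_seo(results):
    # sitemapInfo and robotsInfo are top-level keys in the response
    sitemap = results.get("sitemapInfo", {})
    robots = results.get("robotsInfo", {})
    if not sitemap.get("found"):
        print("No sitemap.xml found")
    elif not sitemap.get("valid"):
        print(f"Sitemap at {sitemap.get('url')} is not valid XML")
    if not robots.get("found"):
        print("No robots.txt found")
    elif not robots.get("hasSitemap"):
        print("robots.txt does not reference a sitemap")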
Issue Object
- type (string): Issue type identifier
- severity (string): "critical" or "warning"
- page (string): URL of the page with the issue
- description (string): Description of the issue
- fix (string, optional): Recommended fix
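A minimal sketch of consuming Issue objects. Note that the key under which the issue list appears is an assumption here (results["issues"]); adjust it to match where your response actually nests them:
def print_critical_issues(results):
    # NOTE: "issues" as the holding key is an assumption; the fields below
    # (type, severity, page, description, fix) are as documented above.
    for issue in results.get("issues", []):
        if issue.get("severity") == "critical":
            print(f"[{issue.get('type')}] on {issue.get('page')}")
            print(f"  {issue.get('description')}")
            if issue.get("fix"):
                print(f"  Suggested fix: {issue['fix']}")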
Error Handling
400 Bad Request
- Missing url parameter
- Invalid URL format
500 Internal Server Error
- Website could not be accessed
- Network errors
- Server-side processing errors
504 Gateway Timeout
- Website took too long to process
- Site may be too large or slow
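The sketch below maps these documented status codes to actionable messages in Python; it assumes nothing beyond the endpoint, parameters, and status codes described above, and requests is the only dependency:
import requests

def check_seo_verbose(url):
    try:
        response = requests.get(
            "https://330hosting.com/api/seo-check",
            params={"url": url, "format": "json"},
            timeout=300,  # generous client-side timeout for large sites
        )
    except requests.exceptions.RequestException as e:
        print(f"Network error before a response arrived: {e}")
        return None
    if response.ok:
        return response.json()
    # Map the documented error codes to friendlier messages
    if response.status_code == 400:
        print("400: check that the url parameter is present and well-formed")
    elif response.status_code == 500:
        print("500: the target site could not be accessed or processed")
    elif response.status_code == 504:
        print("504: the scan timed out; try a smaller maxPages value")
    else:
        print(f"Unexpected status: {response.status_code}")
    return None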
Best Practices
- URL Format: Always use full URLs with protocol (https://example.com)
- Timeout Handling: Set appropriate timeouts (5 minutes recommended)
- Error Handling: Always check the success field before accessing results
- Pagination: For large sites, consider using maxPages to limit analysis
- Caching: Cache results when possible to avoid repeated scans (see the sketch after this list)
- Respectful Usage: Don't spam the API with rapid requests
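A minimal in-memory caching sketch, reusing the check_seo() helper from the Python example above; the 24-hour TTL is an illustrative choice, tune it to your audit cadence:
import time

_cache = {}  # url -> (timestamp, results)
CACHE_TTL = 24 * 60 * 60  # 24 hours; illustrative, not prescribed by the API

def check_seo_cached(url):
    cached = _cache.get(url)
    if cached and time.time() - cached[0] < CACHE_TTL:
        return cached[1]  # still fresh, skip the API call
    results = check_seo(url)  # check_seo() from the Python example above
    if results:
        _cache[url] = (time.time(), results)
    return results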
Use Cases
- Website Audits: Regular SEO and accessibility audits
- CI/CD Integration: Automated checks in deployment pipelines (see the sketch after this list)
- Monitoring: Scheduled checks to track SEO health over time
- Client Reports: Generate SEO reports for clients
- Competitor Analysis: Analyze competitor websites
- Pre-Launch Checks: Verify SEO before launching new sites
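As a sketch of the CI/CD integration idea, the script below fails a pipeline step when the score drops or critical issues appear; it reuses the check_seo() helper from the Python example above, and the threshold is illustrative:
import sys

MIN_SEO_SCORE = 80  # illustrative threshold, tune for your project

def ci_gate(url):
    results = check_seo(url)  # check_seo() from the Python example above
    if not results or not results.get("success"):
        print("SEO check did not complete")
        sys.exit(1)
    score = results["summary"]["seoScore"]
    critical = results["summary"]["criticalIssues"]
    print(f"SEO score: {score}, critical issues: {critical}")
    if score < MIN_SEO_SCORE or critical > 0:
        sys.exit(1)  # non-zero exit fails the pipeline step

ci_gate("https://example.com")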
Support
For questions, issues, or feature requests:
- Email: support@330hosting.com
- Website: https://330hosting.com