If a company website suddenly redirects to an unfamiliar page, or search engines display completely irrelevant titles and descriptions for it, the site may have been hijacked and needs investigation and repair. Most hijacked websites can be fully restored. The first step after discovering the anomaly is to isolate the risk.
If you can log into the server, put the website into maintenance mode right away. With a content management system such as WordPress, enable the built-in maintenance function, or place a simple `index.html` page in the website root directory telling users the site is under maintenance. A more direct isolation method is to temporarily rename the website root directory on the server, or to redirect the entire site to a static notice page through the web server configuration (Nginx or Apache). This stops users from reaching the tampered pages, contains the negative impact, and buys time for investigation.
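For reference, a minimal Nginx sketch of such a temporary redirect might look like the following; the `/var/www/maintenance` path and `maintenance.html` file name are placeholders for your own notice page:

```
# Inside the site's server {} block; paths and file names are examples
error_page 503 /maintenance.html;

location / {
    return 503;                  # answer every request with the notice page
}

location = /maintenance.html {
    root /var/www/maintenance;   # directory containing the static notice page
    internal;                    # page is only served via the 503 redirect
}
```

Returning a 503 status rather than a 200 also tells search engines the outage is temporary, so the notice page does not get indexed in place of your real content.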
Next, accurately determine the type and entry point of the hijacking. Hijacking usually occurs through a few main channels. The most common is tampering with server files: hackers may have uploaded backdoor files through a vulnerability or directly modified the website's source code. Immediately inspect the website files on the server. Pay close attention to entry files in the root directory, such as `index.php`, `index.html`, and `.htaccess` (on Apache), checking for inserted unfamiliar, heavily obfuscated JavaScript or iframe redirection code. Command-line tools make this quick. For example, in the website root directory of a Linux server, the following command finds PHP files modified within the last 7 days:
find . -type f -mtime -7 -name "*.php" | head -20
Alternatively, search for files containing typical hijacking code (such as `eval(`, `base64_decode`, `fromCharCode`, `iframe`, `window.location`):
grep -rE "eval\(|base64_decode|fromCharCode|iframe|window\.location" . --include="*.php" --include="*.js"
Another common form of hijacking is DNS hijacking or domain name poisoning. Use webmaster tools or a global DNS lookup website to check if your domain is being resolved to an unfamiliar IP address. If DNS records have been modified without authorization, you need to contact your domain registrar immediately to verify your account information and correct the DNS records back to the correct IP address of your server as soon as possible, while also enabling two-factor authentication for your account.
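A quick spot check from the command line (assuming the `dig` utility is available and using `example.com` as a stand-in for your domain):

```
# Ask several public resolvers and compare the returned IPs
dig +short example.com A @8.8.8.8    # Google Public DNS
dig +short example.com A @1.1.1.1    # Cloudflare DNS
dig +short example.com NS            # verify the authoritative name servers
```

If different resolvers disagree, or the NS records point somewhere you do not recognize, the problem is at the DNS layer rather than on your server.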
Another possibility is search engine hijacking: the website content itself appears normal, but the titles and descriptions indexed by search engines become spam advertisements. This is usually achieved by checking the visitor's User-Agent. When users access the site directly, the normal content is served, but when a Baidu or Google crawler is detected, a different set of pages stuffed with spam keywords is returned. Checking the server logs, or using a search engine's crawl diagnostic tool to simulate a crawler visit, can expose this deception.
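A simple way to test for this yourself is to fetch the same page with a normal and a crawler User-Agent and compare the two responses; the URL and User-Agent strings below are illustrative:

```
# Fetch the page as an ordinary browser
curl -s -A "Mozilla/5.0" https://example.com/ -o normal.html

# Fetch the same page claiming to be Googlebot
curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
     https://example.com/ -o crawler.html

# Significant differences suggest User-Agent cloaking
diff normal.html crawler.html | head -40
```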
After locating the root cause, begin a thorough cleanup. If file tampering is confirmed, the most reliable method is to restore from a clean backup: make sure the backup is a verified clean version created before the compromise, and never restore from a backup that may itself be infected.
If a clean backup is unavailable, manual cleanup is required. This includes:
1. Delete all suspicious files: Not only tampered webpage files, but also carefully search for backdoor files with seemingly normal names (such as `wp-config.php.bak`, `cache.php`) and hidden WebShells.
2. Thoroughly update: Update all CMS cores, themes, and plugins to the latest versions. Vulnerabilities in older versions are the most common entry points for hackers.
3. Reset all passwords: Including server SSH/FTP passwords, database passwords, website backend administrator passwords, etc. Ensure that each password is strong and unique.
4. Database review: Hackers may inject malicious code into database fields such as post content and option tables; careful screening is necessary.
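For a WordPress site, a rough first pass over the database could look like this; the database name `wordpress` and the default `wp_` table prefix are assumptions, so adjust them to your installation:

```
# Look for injected script or iframe tags in post content
mysql -u root -p wordpress -e \
  "SELECT ID, post_title FROM wp_posts
   WHERE post_content LIKE '%<script%' OR post_content LIKE '%<iframe%';"

# Check options attackers often tamper with, such as the site URLs
mysql -u root -p wordpress -e \
  "SELECT option_name, option_value FROM wp_options
   WHERE option_name IN ('siteurl', 'home');"
```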
During the cleanup process, it is crucial to avoid falling into a "cat-and-mouse game." Some malicious code has the ability to reinfect repeatedly; if backdoors are not completely removed, the website can quickly be compromised again. Therefore, close monitoring of file integrity is essential after cleanup. File integrity monitoring tools or simple scripts can be used to record the hash values of critical files and compare them regularly.
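A minimal sketch of such a script using standard tools, with `/var/www/html` as an example web root:

```
# After cleanup, record a hash baseline for critical file types
find /var/www/html -type f \( -name "*.php" -o -name "*.js" \) \
     -exec sha256sum {} + > /root/site-baseline.sha256

# Re-run periodically (e.g., from cron); any mismatch means a file changed
sha256sum --check --quiet /root/site-baseline.sha256
```

Note that this only detects changes to files present at baseline time; combining it with the earlier `find -mtime` check also catches newly created files.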
After the website is back online, strengthening security defenses is key to preventing recurrence. First, plug the entry points for vulnerabilities:
Strengthen credential security: Eliminate the use of weak passwords and enable two-factor authentication for all administrative backends.
Maintain updates: Establish mechanisms to ensure that the server operating system, web server software, PHP/Node.js and other runtime environments, and all application code receive security patches promptly.
The principle of least privilege: The user running the web server (e.g., `www-data`) should only have read and necessary write permissions for website files, and should not have system-level permissions.
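As a sketch of such permissions on a typical Debian/Ubuntu layout, assuming the code is deployed by a separate `deploy` user, the web server runs as `www-data`, and the site lives in `/var/www/html`:

```
# Code is owned by the deploy user; the web server can read but not write it
chown -R deploy:www-data /var/www/html
find /var/www/html -type d -exec chmod 750 {} +
find /var/www/html -type f -exec chmod 640 {} +

# Only directories that genuinely need runtime writes get web-server ownership
chown -R www-data:www-data /var/www/html/wp-content/uploads
```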
Secondly, deploy proactive protection:
Configure a Web Application Firewall: Whether it's a cloud WAF or a server-level software WAF, it can effectively block common injection and cross-site scripting attacks.
Modify default settings: Change the default backend address, database table prefix, etc., of the CMS.
Restrict access: Through `.htaccess` or Nginx configuration, restrict access to the backend management page to only company or home IP addresses.
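In Nginx, for example, an allow-list on the backend path might look like this (the path and IP address are placeholders):

```
# Limit the CMS backend to known addresses
location /wp-admin/ {
    allow 203.0.113.10;   # e.g., the office's fixed IP
    deny  all;
}
```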
Finally, restore trust and reputation. Recovering from a hijacking takes time, especially if search engines like Google have flagged the site as a "dangerous website"; rankings will not return overnight. You need to:
Request a re-review: After confirming the website has been thoroughly cleaned, submit the security issue details and a re-review request in Google Search Console and Baidu Webmaster Tools.
Update Sitemap: Generate and submit a new sitemap to guide search engines to recrawl the site.
Add positive signals: Continuously publish high-quality original content and acquire quality backlinks through legitimate channels, signaling to search engines that the website has recovered.
Maintaining detailed records throughout the entire process is crucial: the time of the intrusion, the anomalies observed, the investigation steps, the malicious code samples found, the cleanup operations, and the hardening measures implemented.