Difference between revisions of "Research Computing Configuration"
multitail -cS apache -ev "Bot" /var/log/apache2/access.log -ci white -e "Bot" -I /var/log/apache2/access.log
multitail -cS apache -ev "Bot" -ev "bot" -ev "internal dummy connection" /var/log/apache2/access.log
====Traceroute====

apt install traceroute

Note: [https://zmap.io/ Zmap] seems popular nowadays, based on traffic logs.
==Old machines==
Revision as of 15:39, 8 February 2021
This page describes the configuration of the new research computing machines: Father (Windows Server 2019) and Mother (Ubuntu Server 20.04). Note that the RDP Software Configuration describes the software installed on Father.
The hardware description and complete build notes and configuration information for Bastard, our blisteringly fast, multi-GPU, A.I. estimation platform, are on the DIGITS DevBox page. The hardware descriptions for Father and Mother are on the Research Computing Hardware page.
See also: Recovering Astarte for notes on moving content from an old mediawiki installation to a new one.
Contents
- 1 Both machines (Father and Mother)
- 2 RDP Server (Father)
- 3 Dbase Server (Mother)
- 3.1 Partitioning
- 3.2 Standard Packages
- 3.3 Samba
- 3.4 PostgreSQL
- 3.5 Mediawiki
- 3.6 Wordpress
- 3.7 Other Web Server
- 3.8 Nvidia
- 3.9 Other
- 3.10 To do
- 3.11 Mediawiki Redux
- 3.12 Upgrade mediawiki
- 3.13 Update Linux
- 3.14 Wordpress
- 3.15 Upgrading Linux Distro
- 3.16 Wordpress Redux
- 3.17 SEO
- 3.18 HTTPS
- 3.19 Install VSFTPD
- 3.20 Final Configuration Changes to Apache
- 3.21 Useful tools
- 4 Old machines
Both machines (Father and Mother)
Fan Control
Unless you want to go insane from the sound of fans cycling between full-on and off, you'll want to fix the IPMI fan settings. These are stored in the BIOS but not accessible through the BIOS screens; instead, connect to each box's BMC over the network. The BMC IPv4 address is displayed during POST.
To read all about IPMI, see https://www.supermicro.com/products/nfo/IPMI.cfm. I also found these helpful:
- https://blog.pcfe.net/hugo/posts/2018-08-14-epyc-ipmi-fans/
- https://calvin.me/quick-how-to-decrease-ipmi-fan-threshold/
Note that the default BMC username and password are ADMIN and ADMIN. You can download the SMCIPMITool (2.21.0_build.181029) and do the following, though I couldn't work out how to send manual configuration instructions using it:
.\SMCIPMITool.exe 192.168.2.80 ADMIN ADMIN ipmi fan
.\SMCIPMITool.exe 192.168.2.80 ADMIN ADMIN ipmi fan 0
The trick is to change the thresholds for the fans, especially the lower threshold. On a linux box:
sudo apt-get install ipmitool
- Reset the BMC if you've screwed it up (or if your fans are full on all the time)
- Check that the current mode is optimal (2)
- Take a look at the sensor multiple times to see that the fan is hitting the constraint (run multiple times to coincide with different sound levels)
- Reset the lower thresholds on the fans
- Enjoy a perfectly reasonable fan speed that doesn't fluctuate unduly
ipmitool -I lan -U ADMIN -P ADMIN -H 192.168.2.80 raw 0x3c 0x40
ipmitool -I lan -U ADMIN -P ADMIN -H 192.168.2.80 raw 0x30 0x45 0x00
ipmitool -I lan -U ADMIN -P ADMIN -H 192.168.2.80 sensor
ipmitool -I lan -U ADMIN -P ADMIN -H 192.168.2.80 sensor thresh FAN3 lower 120 220 320
Note how the thresholds are derived: 320 is 20% below 400 rpm (the minimum per the Noctua spec), and 220 and 120 are a further 100 and 200 rpm below that.
For the ARCTIC F8 PWM, the minimum is something like 250 rpm at 5 V (see https://www.arctic.ac/us_en/arctic-f8-pwm.html). I therefore used 100, 150, 200, which stabilized the fans nicely at a 300 rpm idle on one box and 400 rpm on the other.
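The threshold arithmetic above can be sketched as a tiny shell helper. This is just a sanity-check of the "-20%, then -100, then -200" rule described above, using the Noctua 400 rpm spec minimum; the variable names are my own:

```shell
# Derive the three lower thresholds (non-recoverable, critical, non-critical)
# from a fan's spec minimum RPM, per the -20%/-100/-200 rule above.
min_rpm=400                      # Noctua spec minimum
lnc=$(( min_rpm * 80 / 100 ))    # lower non-critical: 20% below spec minimum
lcr=$(( lnc - 100 ))             # lower critical
lnr=$(( lcr - 100 ))             # lower non-recoverable
echo "$lnr $lcr $lnc"            # prints: 120 220 320
# Then apply, e.g.:
# ipmitool -I lan -U ADMIN -P ADMIN -H 192.168.2.80 sensor thresh FAN3 lower $lnr $lcr $lnc
```

The echoed values match the 120 220 320 used in the ipmitool command above.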
Note that the BMC IP for Mother often turns up at 192.168.2.70. You can see the BMC IP address of the localhost by running:
ipmitool lan print
BIOS Settings
Because we want the NVMe drives to be bootable, we need to use (U)EFI for both machines:
- Change CPU1 Slot 1 and 2 to EFI (from Legacy)
- Change Onboard Video OPROM to EFI (from Legacy)
- LAN device to EFI
- Boot to EFI
- Priority to onboard: auto
- Boot install order: CD above drive above Shell
This ultimately seems to be a problem for unsigned Nvidia drivers in Linux, which makes putting a GPU into the dbase server box a major issue. I tried disabling validation and adding a key to Secure Boot (not sure that it took), but nothing I did would fix the resulting driver issue.
mokutil --disable-validation
sudo update-secureboot-policy --new-key
sudo update-secureboot-policy --enroll-key
RDP Server (Father)
The RDP server runs Windows Server 2019. It installs directly off the media onto the NVMe drive. Don't worry about the RAID array during the installation; we set that up later.
After installation:
- Set computer name
- Storage Pool
- RAID 1 - Mirroring
- Active Directory Controller
- Remote Desktop Connection
- RD Connection Broker
- RD Licensing
- RD Session Host
- RD Virtualization Host
Change password complexity requirements: https://blog.tiga.tech/disable-the-password-complexity-for-active-directory-on-a-domain-controller/
Update
The GPU was removed from the RDP server and the chipset drivers from Supermicro were installed (files are in E:/installs/drivers). This addressed all the device issues.
The following software was uninstalled:
- CUDA Development 10.1
- CUDA Documentation 10.1
- CUDA Samples 10.1
- CUDA Visual Studio Integration 10.1
- NVIDIA GeForce Experience 3.18.0.94
- NVIDIA Graphics Driver 419.67
- NVIDIA HD Audio Driver 1.3.38.13
- NVIDIA Nsight Compute v2019.1
- NVIDIA Nsight Systems v2018.3.3
- NVIDIA Nsight Visual Studio Edition
- NVIDIA PhysX System Software 9.12.0218
- NVIDIA Tools Extension SDK
- NVIDIA USBC Driver 1.1.27.831
Dbase Server (Mother)
The dbase server runs Ubuntu 18.04. You can mostly follow the instructions at https://www.pugetsystems.com/labs/hpc/The-Best-Way-To-Install-Ubuntu-18-04-with-NVIDIA-Drivers-and-any-Desktop-Flavor-1178/
It is important to use the alternative installation ISO, not the default cloud-based one. However, you can set up the partitions using the standard live CD, which is more user-friendly for partitioning.
Partitioning
Partition as:
- 400G NVMe as ext4 and / (create new using whole drive, automatically sets aside 512m for /boot/efi)
- 1.2T as ext4 and /data (do it manually)
- RAID 10 for HDDs, all active then format at ext4 and mount at /bulk (DO NOT PARTITION! Just set up the software RAID and format.)
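A dry-run sketch of that RAID-10 step with mdadm. The device names /dev/sd[a-d] are assumptions, not from the build notes; the echo prefixes just print the commands, so drop them to actually build, format, and mount the array:

```shell
# Dry run of the software-RAID-10 setup described above: build the array
# from whole disks (no partitions), format it as ext4, and mount at /bulk.
devices="/dev/sda /dev/sdb /dev/sdc /dev/sdd"   # assumed device names
echo mdadm --create /dev/md0 --level=10 --raid-devices=4 $devices
echo mkfs.ext4 /dev/md0
echo mount /dev/md0 /bulk
```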
Standard Packages
Install the following straight from the media:
- LAMP Server
- Mail Server
- Postfix - Internet Site
- MailName: mother.edegan.com
- PostgreSQL
- Samba
- OpenSSH
Then:
sudo apt-get install tasksel
tasksel    (select Ubuntu Desktop)
A useful apt cheatsheet is: https://www.acpsd.net/site/handlers/filedownload.ashx?moduleinstanceid=53976&dataid=56016&FileName=Ubuntu%20Cheat%20Sheet.pdf
Get the system up to date:
apt-get update
apt-get upgrade
Samba
This guide is helpful: https://linuxconfig.org/how-to-configure-samba-server-share-on-ubuntu-18-04-bionic-beaver-linux
Check samba is running
samba --version
Then fix the conf file:
cp /etc/samba/smb.conf /etc/samba/smb.conf.bak
vi /etc/samba/smb.conf
    workgroup = mothergroup
    usershare allow guests = no
    ;comment out the [printers] and [print$] sections
    [bulk]
    comment = Bulk RAID Array
    path = /bulk
    browseable = yes
    create mask = 0775
    directory mask = 0775
    read only = no
    guest ok = no
Test the parameters, change the permissions and ownership:
testparm /etc/samba/smb.conf
chmod 770 /bulk
groupadd smbusers
chown :smbusers /bulk
Now add the researcher account, and add it to the samba share
groupadd -g 1001 researcher
useradd -g researcher -G smbusers -s /bin/bash -p 1234 -d /home/researcher -m researcher
passwd researcher    (hint: littleamount)
smbpasswd -a researcher
Finally restart samba:
systemctl restart smbd
Check it works:
smbclient -L localhost (no root password)
And add users to the samba group:
adduser ed smbusers
PostgreSQL
This guide is helpful: https://linuxconfig.org/install-postgresql-on-ubuntu-18-04-bionic-beaver
Test it!
psql --help
ss -nlt    (postgres is listening on 5432)
Back up the config file and try a manual launch:
cp /etc/postgresql/10/main/postgresql.conf /etc/postgresql/10/main/postgresql.conf.bak
mkdir /data/postgres
chown postgres:postgres /data/postgres
su postgres
cd /usr/lib/postgresql/10/bin
./initdb -D /data/postgres
Now, tune the database server. See https://www.postgresql.org/docs/10/runtime-config-resource.html and https://wiki.postgresql.org/wiki/Tuning_Your_PostgreSQL_Server
vi /etc/postgresql/10/main/postgresql.conf
    data_directory = '/data/postgres'    # custom 1.2TB NVMe SSD
    listen_addresses = '*'
    max_connections = 10
    shared_buffers = 128GB
    huge_pages = try                     # on, off, or try
    temp_buffers = 8GB                   # min 800kB
    work_mem = 4GB                       # min 64kB
    maintenance_work_mem = 64GB          # min 1MB
    max_stack_depth = 6MB
    max_wal_senders = 5                  # MUST SET THIS TO BE LESS THAN max_connections
    effective_cache_size = 384GB
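As a sanity check on those numbers: the usual rules of thumb from the PostgreSQL tuning wiki linked above are shared_buffers at roughly 25% of RAM and effective_cache_size at 50-75%. Assuming the box has 512 GB of RAM (an assumption; the Research Computing Hardware page has the real figure), the arithmetic reproduces the values in the config:

```shell
# Rule-of-thumb tuning arithmetic: ~25% of RAM for shared_buffers,
# ~75% for effective_cache_size. ram_gb=512 is an assumed figure.
ram_gb=512
shared_buffers=$(( ram_gb / 4 ))        # 25% of RAM
effective_cache=$(( ram_gb * 3 / 4 ))   # 75% of RAM
echo "shared_buffers=${shared_buffers}GB effective_cache_size=${effective_cache}GB"
# prints: shared_buffers=128GB effective_cache_size=384GB
```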
Now fix the hba.conf file for access rights:
cp /etc/postgresql/10/main/pg_hba.conf /etc/postgresql/10/main/pg_hba.conf.bak
vi /etc/postgresql/10/main/pg_hba.conf
    change "local all all peer" to "local all all trust"
Don't do this bit for now:
    local all postgres peer
    local all postgres md5
And restart!
service postgresql restart
ss -nlt
If postgres isn't listening, then it didn't actually start, despite the message from:
service postgresql status
Then we note that the Postgres binaries are in:
cd /usr/lib/postgresql/10/bin
Check:
cat /etc/init.d/postgresql cat /usr/share/postgresql-common/init.d-functions
To diagnose errors, do a manual start as postgres from /usr/lib/postgresql/10/bin:
pg_ctl -w -D /data/postgres -o '--config-file=/etc/postgresql/10/main/postgresql.conf.new' start
To check it is working:
ss -nlt
psql postgres
SHOW data_directory;
SHOW work_mem;
When it is all ok, you can go back to using
service postgresql restart (it should take a few secs)
Make the researcher user!
createuser --interactive researcher
Add Extensions
Finally, add some extensions:
apt-get install postgresql-plperl-10
apt-get install postgresql-plpython-10
apt-get install postgresql-10-plr
apt-get install postgresql-10-postgis-2.4
apt-get install postgresql-10-postgis-scripts
apt-get install postgis
apt-get install postgis-gui
As postgres:
psql template1
CREATE EXTENSION plr;
CREATE EXTENSION plperl;
CREATE EXTENSION plpythonu;
Mediawiki
We had a backup of an old MySQL mediawiki dbase and the contents of the mediawiki directory, and we wanted to restore the old wiki. This is what we did.
Restore the old dbase:
cd /bulk/mcnair/Web/mysqldump
mysql -h localhost < web_mysqldump_backup_Fri_Aug_24_15_35_47_2018.sql    (dbase is mcnair)
Connect to MySQL and check what we have:
mysql
connect
show databases;
use mcnair;
show tables;
If you need to:
    systemctl stop mysql
    systemctl start mysql
Get the old install:
cd /home/mcnair/Downloads/
tar -xvzf mediawiki-1.26.2.tar.gz
Add php-xml
apt-get install php-xml
apachectl restart
We then had to fix the passwords in the dbase:
#Note: change passwords from hints before running
mysql
SELECT User, Host, Password FROM mysql.user;
UPDATE mysql.user SET Password = PASSWORD('tsn') WHERE User = 'root';
UPDATE mysql.user SET Password = PASSWORD('tsn') WHERE User = 'debian-sys-maint';
UPDATE mysql.user SET Password = PASSWORD('tsn') WHERE User = 'mcnair_wp';
FLUSH PRIVILEGES;
At this point, the basics are working, so go to http://192.168.2.92/mediawiki/mw-config/index.php and fill it out as per the old instructions (see Test Web Server Documentation and Web Server Documentation).
Now overwrite LocalSettings.php with the old configuration:
cd /home/ed/Downloads/
mv LocalSettings.php /var/www/html/mediawiki/
Fix the requirements for mediawiki
apt-get install php-xml
apachectl restart
Allow short URLs by enabling mod_rewrite (if not already enabled):
a2enmod rewrite
systemctl restart apache2
Now fix the apache conf file
cp /etc/apache2/sites-available/000-default.conf /etc/apache2/sites-available/000-default.conf.bak
vi /etc/apache2/sites-available/000-default.conf
    Alias /wiki /var/www/html/mediawiki/index.php
    #Enable the rewrite engine
    RewriteEngine On
    #Rewrite / to Main Page
    RewriteRule ^/*$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]
service apache2 restart
Now create phpinfo page for debugging
cd /var/www/html
vi phpinfo.php
    <?php echo phpinfo(); ?>
Browse to http://192.168.2.92/phpinfo.php
    Shows Phar is installed and running
    Shows the log directory is /var/log/apache2
Check pcntl is enabled
php --ri pcntl    #Note that some pcntl functions are listed as disabled in phpinfo.php
dpkg -s snmp
apt-get install snmp
The Big Try
The process is as follows:
- Move the contents of /var/lib/mediawiki to somewhere else
- Move the contents of /bulk/mcnair/Web/www/var/www/html/mediawiki in
- Drop databases
- Restore databases
- Pray
Or more specifically:
mv /var/lib/mediawiki/ /var/lib/firstmediawikitry
mkdir /var/lib/mediawiki
cp -r /bulk/mcnair/Web/www/var/www/html/mediawiki /var/lib/
#Change password for dbase in LocalSettings.php
mysql -p
DROP DATABASE mcnair;
DROP DATABASE wordpress;
cd /bulk/mcnair/Web/mysqldump
mysql -p -h localhost < web_mysqldump_backup_Fri_Aug_24_15_35_47_2018.sql
apachectl restart
When we did this, we got a blank page! Don't panic.
cd /var/log/apache2
cat error.log
apt-get install php7.2-mbstring
apachectl restart
Fix LocalSettings.php again
Change the domain name, contact details, etc. Need to fix mail... Also left the GoogleAnalytics extension loaded for now... Change the enableSemantics IP address.
Fix the Infoboxes:
chmod a+x /var/lib/mediawiki/extensions/Scribuntu...
chcon -t httpd_sys_script_exec_t /var/lib/mediawiki/Scribuntu...
mv /etc/apache2/sites-available/000-default.conf /etc/apache2/sites-available/000-default.conf.new
mv /etc/apache2/sites-available/000-default.conf.bak /etc/apache2/sites-available/000-default.conf
apachectl restart
mv /etc/apache2/sites-available/000-default.conf.new /etc/apache2/sites-available/000-default.conf
apachectl restart
We were left with a problem where the page would never finish loading. This turned out to be a problem with the fonts in the Vector skin, which we had previously customized. We tried to fix the problem in the dbase as below, but to no avail.
SELECT * FROM externallinks WHERE el_id = 2599;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Regular.ttf' WHERE el_id = 2720;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Regular.ttf' WHERE el_id = 2720;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Italic.ttf' WHERE el_id = 2721;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Italic.ttf' WHERE el_id = 2721;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Bold.ttf' WHERE el_id = 2722;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-Bold.ttf' WHERE el_id = 2722;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-BoldItalic.ttf' WHERE el_id = 2723;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/OpenSans-BoldItalic.ttf' WHERE el_id = 2723;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/BonvenoCF-Light.otf' WHERE el_id = 2724;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/BonvenoCF-Light.otf' WHERE el_id = 2724;
UPDATE externallinks SET el_to = 'http://192.178.2.92/mediawiki/resources/assets/fonts/franklin-gothic-book.ttf' WHERE el_id = 2739;
UPDATE externallinks SET el_index = 'http://192.178.2.92/mediawiki/resources/assets/fonts/franklin-gothic-book.ttf' WHERE el_id = 2739;
UPDATE externallinks SET el_to = 'http://192.178.2.92/wiki/Carried_Interest_Debate' WHERE el_id = 2599;
UPDATE externallinks SET el_index = 'http://192.178.2.92/wiki/Carried_Interest_Debate' WHERE el_id = 2599;
What did work was:
cd /var/www/html/mediawiki/skins
diff -r Vector VectorBackup
cp -r Vector/ VectorFromMcNair
vi Vector/variables.less
Replace all font-family statements with: "Linux Libertine", Georgia, Times, sans-serif, serif;
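That hand edit could also be scripted. A sketch with GNU sed, demonstrated here against a throwaway copy rather than the real file (the real target would be skins/Vector/variables.less, backed up first):

```shell
# Replace every font-family declaration with the safe stack from above.
# Demonstrated on a scratch file; point the sed at skins/Vector/variables.less
# (after backing it up) to do the real edit.
f=$(mktemp)
printf '@body { font-family: "Open Sans", sans-serif; }\n' > "$f"
sed -i -E 's/font-family:[^;]*;/font-family: "Linux Libertine", Georgia, Times, sans-serif, serif;/g' "$f"
grep -c 'Linux Libertine' "$f"   # prints 1
rm -f "$f"
```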
When I rebooted, the MySQL database was inaccessible for reasons unknown... I followed this page: https://www.howtoforge.com/setting-changing-resetting-mysql-root-passwords
service mysql stop
mysqld_safe --skip-grant-tables &
mkdir -p /var/run/mysqld
chown mysql:mysql /var/run/mysqld
mysqld_safe --skip-grant-tables &
And then all was good!
Confirm Account
We had some issues with some of the extensions, particularly ConfirmAccount. To debug the wiki, add this line to LocalSettings.php:
$wgShowExceptionDetails = true;
Fix the confirm account, see https://www.mediawiki.org/wiki/Extension:ConfirmAccount#Minimal_settings and add lines to LocalSettings.php.
apt-get install php7.2-dev
apt-get install php-pear
pear version
pear config-get php_dir
    /usr/share/php
phpinfo reports the loaded config file as /etc/php/7.2/apache2/php.ini
Then make check_pear.php as per http://pear.php.net/manual/en/installation.checking.php. It should return bool(true).
Finally:
php -c /etc/php/7.2/apache2/php.ini -r 'echo get_include_path()."\n";'
    .:/usr/share/php
In php.ini, uncomment the include_path line under ; UNIX: "/path1:/path2":
    include_path = ".:/usr/share/php"
See https://www.mediawiki.org/wiki/Manual:$wgSMTP
Wordpress
From the Test Web Server Documentation, it doesn't look like we had to install anything before we installed Wordpress. The restoration plan is therefore:
- Restore the dbase (done already when we restored the mediawiki dbase)
- Copy over all of the wordpress files
- Create a /blog alias in apache
- Check the permissions and pray
So we did:
cd /bulk/mcnair/Web/www/var/www/html
cp -r ./blog/ /var/www/html/blog
cd /var/www/html/blog/
Check Apache2.conf
vi /etc/apache2/apache2.conf    (looks fine)
Temporarily force some settings
vi wp-config.php
    define('WP_HOME','http://www.edegan.com/blog');
    define('WP_SITEURL','http://www.edegan.com/blog');
    define('DB_PASSWORD', 'tsn');
Make some changes to the dbase
mysql
connect wordpress;
SELECT ID, user_login, user_pass FROM wp_users;
UPDATE wp_users SET user_pass=MD5('newstrongpassword') WHERE ID = 4;
SELECT * FROM wp_options WHERE option_name='siteurl';
SELECT * FROM wp_options WHERE option_name='home';
UPDATE wp_options SET option_value='http://www.edegan.com/blog' WHERE option_name='siteurl';
UPDATE wp_options SET option_value='http://www.edegan.com/blog' WHERE option_name='home';
Now you can comment out the WP_HOME and WP_SITEURL settings in wp-config.php and change them (if you want) from the wp-admin interface: http://www.edegan.com/blog/wp-admin.
The following plugin had to be disabled:
- Social Share WordPress Plugin - AccessPress Social Share
Finally, fix the permalink issue by setting
vi /etc/apache2/apache2.conf
    AllowOverride All
Then yay!
The whole thing needs updating, a new skin (or at least clean up), and some of the plugins don't work. But the basics are now up and running.
Updating
Go into the wp-admin interface and hit update. It seemed to work fine!
Run the site-health.php tool: http://www.edegan.com/blog/wp-admin/site-health.php
apt-get install php7.2-gd
apt-get install php7.2-bcmath
Get and make ImageMagick -- see https://www.tutorialspoint.com/articles/how-to-install-imagemagick-on-ubuntu
magick -help
apachectl restart
Also, update all of the plugins and remove the one inactive plugin that was causing problems earlier.
Other Web Server
For Google Analytics we linked the domain to dredegan@gmail.com on the Google Dashboard and added the key to LocalSettings.php. See http://edutechwiki.unige.ch/en/Mediawiki_installation#Google_Analytics
We also added write permissions to the images directory for www-data
chown -R www-data images/
Nvidia
The original intention was to install a GPU into the Dbase server, as GPU compute tasks wouldn't interfere (much) with the main operation of the server. The problem seems to be a combination of an unsigned Nvidia driver, Ubuntu 18.04, UEFI, and Secureboot (or not). See https://medium.com/@nolanmudge/installing-an-nvidia-graphics-driver-with-a-ubuntu-14-04-and-up-efi-boot-52725dd6927c
Regardless, here are some useful commands.
See what drivers are being used
apt-get install ubuntu-drivers-common
ubuntu-drivers devices
cat /proc/driver/nvidia/version
See the display hardware config
sudo lshw -c display
If this shows *-display UNCLAIMED with no driver associated with it, see https://askubuntu.com/questions/762254/why-do-i-get-required-key-not-available-when-install-3rd-party-kernel-modules
Just try to work out what is going on:
ubuntu-drivers devices
lsmod
lshw -c display
sudo lspci -vk
ls -l /sys/firmware/efi/
Try installing CUDA and its driver: https://www.pugetsystems.com/labs/hpc/How-To-Install-CUDA-10-together-with-9-2-on-Ubuntu-18-04-with-support-for-NVIDIA-20XX-Turing-GPUs-1236/ And then: https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#post-installation-actions And maybe: https://xcat-docs.readthedocs.io/en/stable/advanced/gpu/nvidia/verify_cuda_install.html
Try installing the bundled cuda toolkit
apt-get install nvidia-cuda-toolkit
apt-get install cuda-samples-7-0 -y
cd /usr/local/cuda-7.0/samples
make
Try installing the bundled nvidia driver
sudo apt install nvidia-driver-415
Purge nvidia drivers and add the experimental repo
apt-get purge nvidia*
sudo add-apt-repository ppa:graphics-drivers
Get the latest driver from Nvidia and make it
wget http://us.download.nvidia.com/XFree86/Linux-x86_64/418.43/NVIDIA-Linux-x86_64-418.43.run
apt-get install gcc
apt-get install make
sh NVIDIA-Linux-x86_64-418.43.run
cat /var/log/nvidia-installer.log
View system logs:
journalctl -xb
Remove nouveau if being used (https://help.ubuntu.com/community/BinaryDriverHowto/Nvidia)
sudo ubuntu-drivers devices
sudo apt-get --purge remove xserver-xorg-video-nouveau
nvidia-xconfig
Add a secure boot key
sudo update-secureboot-policy --new-key
sudo update-secureboot-policy --enroll-key
Disable/enable secureboot:
mokutil --disable-validation
    create an 8-12 char password (same as ed's)
    reboot and disable secureboot
mokutil --enable-validation
Other
pdftk
I installed pdftk and configured it as follows:
snap install pdftk
ln -s /snap/pdftk/current/usr/bin/pdftk /usr/bin/pdftk
Its man page (kinda) is here: https://www.pdflabs.com/docs/pdftk-man-page/
But generally you want to use it to combine files:
pdftk *.pdf cat output newfile.pdf
pdftk a.pdf b.pdf cat output newfile.pdf
If you want to reduce a file that has large images in it, then the following sometimes works:
pdf2ps 1.pdf 1.ps
ps2pdf -dPDFSETTINGS=/screen -dDownsampleColorImages=true -dColorImageResolution=144 -dColorImageDownsampleType=/Bicubic 1.ps 1.pdf
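A small wrapper for that two-step shrink (shrink_pdf is a hypothetical helper of mine, not part of pdftk or ghostscript); with DRY_RUN=1 it just prints the commands so you can check the flags before running them for real:

```shell
# Hypothetical wrapper around the pdf2ps/ps2pdf shrink above.
# With DRY_RUN=1 it only echoes the commands it would run.
shrink_pdf() {
  in=$1; out=$2
  ps="${in%.pdf}.ps"
  cmd1="pdf2ps $in $ps"
  cmd2="ps2pdf -dPDFSETTINGS=/screen -dDownsampleColorImages=true -dColorImageResolution=144 -dColorImageDownsampleType=/Bicubic $ps $out"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$cmd1"; echo "$cmd2"
  else
    $cmd1 && $cmd2
  fi
}
DRY_RUN=1 shrink_pdf big.pdf small.pdf
```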
To do
- VNC!
- ImageMagick and uploads -- test
- Math extension later...
- Set up and configure Postfix mail server (https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-postfix-on-ubuntu-18-04)
- Fix [[<haloacl-inclusion-denied>|Terms of Service]] on sign up page...
Mediawiki Redux
Mobile Front End
It seems that the Extension:MobileFrontend isn't working properly.
Does php have mbstring support?
First check the apache version:
apache2 -v
    Server version: Apache/2.4.29 (Ubuntu)
    Server built: 2018-10-10T18:59:25
And the php version:
php -v
    PHP 7.2.17-0ubuntu0.18.04.1 (cli) (built: Apr 18 2019 14:12:38) ( NTS )
    Copyright (c) 1997-2018 The PHP Group
    Zend Engine v3.2.0, Copyright (c) 1998-2018 Zend Technologies
        with Zend OPcache v7.2.17-0ubuntu0.18.04.1, Copyright (c) 1999-2018, by Zend Technologies
Try just installing it:
apt-get install php7.2-mbstring
This failed because of an issue with dpkg. I rebooted and then:
dpkg --configure -a
apt --fix-broken install
Then it installed but threw a notice about a modified configuration file. The differences were too large to show. I kept the old file (/etc/php/7.2/apache2/php.ini) for comparison to the new one (/usr/lib/php/7.2/php.ini-production).
In the old file, uncomment the mbstring extension and exif (after it) too, then reload the config and check it worked:
service apache2 reload
php -i | grep mbstring
So everything seems fine (the extension is listed as loaded in Special:Version), but the interface still has obvious issues.
Check the skin
From Special:Version
MediaWiki 1.26.2
PHP 7.2.24-0ubuntu0.18.04.2 (apache2handler)
MySQL 5.7.25-0ubuntu0.18.04.2
Lua 5.1.5
Vector is the only installed skin.
And it looks like my version of mediawiki is too old to support Minerva Neue
Upgrade mediawiki
Essentially, follow instructions in Manual:Upgrading:
Backup
I just moved everything to a different directory, backed off the dbase, and started again.
cd /var/lib
mv mediawiki mediawikibackup26082020
mysqldump --user=root --password=password > dbase.sql
New Install
Get a new version, put it in /var/lib/mediawiki (leaving the old shortcuts pointing there), then copy in the files.
wget https://releases.wikimedia.org/mediawiki/1.34/mediawiki-1.34.2.tar.gz
tar -xvzf mediawiki-1.34.2.tar.gz
mv mediawiki-1.34.2 mediawiki
rm mediawiki-1.34.2.tar.gz
cp mediawikibackup26082020/LocalSettings.php mediawiki/LocalSettings.php
cp -a mediawikibackup26082020/images/ mediawiki/
Note: you don't need to change permissions because we used -a. This also copied the .htaccess file. Everything should be ok...
cp /var/lib/mediawikibackup26082020/resources/assets/EdEganDotCotWikiGreenTab.png /var/lib/mediawiki/resources/assets/
Note: that's my wiki logo.
cp /var/lib/mediawikibackup26082020/favicon.ico /var/lib/mediawiki/favicon.ico
Note: that's the favicon!
cd /var/lib/mediawiki/maintenance/
php update.php
Extensions
Retrieve and include extensions
Get the bulk of them...
mkdir installs
cd installs
wget https://extdist.wmflabs.org/dist/extensions/LabeledSectionTransclusion-REL1_34-4aa6bfa.tar.gz
tar -xzf LabeledSectionTransclusion-REL1_34-4aa6bfa.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/MobileFrontend-REL1_34-6a8ef84.tar.gz
tar -xzf MobileFrontend-REL1_34-6a8ef84.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/ImportUsers-REL1_34-2f1a670.tar.gz
tar -xzf ImportUsers-REL1_34-2f1a670.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/Scribunto-REL1_34-f7bc2e3.tar.gz
tar -xzf Scribunto-REL1_34-f7bc2e3.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/googleAnalytics-REL1_34-6441403.tar.gz
tar -xzf googleAnalytics-REL1_34-6441403.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/MultiUpload-REL1_34-e018c1d.tar.gz
tar -xzf MultiUpload-REL1_34-e018c1d.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/UserMerge-REL1_34-3517022.tar.gz
tar -xzf UserMerge-REL1_34-3517022.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/skins/MinervaNeue-REL1_34-ba11b7b.tar.gz
tar -xzf MinervaNeue-REL1_34-ba11b7b.tar.gz -C /var/lib/mediawiki/skins
wget https://extdist.wmflabs.org/dist/extensions/ConfirmAccount-REL1_34-3ffa446.tar.gz
tar -xzf ConfirmAccount-REL1_34-3ffa446.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/MassEditRegex-REL1_34-d3570f1.tar.gz
tar -xzf MassEditRegex-REL1_34-d3570f1.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/DataTransfer-REL1_34-1fc1c61.tar.gz
tar -xzf DataTransfer-REL1_34-1fc1c61.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/HTMLets-REL1_34-a8227c3.tar.gz
tar -xzf HTMLets-REL1_34-a8227c3.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/GeoData-REL1_34-8a52fa4.tar.gz
tar -xzf GeoData-REL1_34-8a52fa4.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/NumberFormat-REL1_34-cf8a23e.tar.gz
tar -xzf NumberFormat-REL1_34-cf8a23e.tar.gz -C /var/lib/mediawiki/extensions
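The repeated wget/tar pairs above all follow one pattern, so they could be collapsed into a loop. A dry-run sketch (it only echoes the commands; drop the echos to fetch and unpack for real), using the first three tarball names from the list as examples:

```shell
# Dry-run loop over the extension tarballs; each would be fetched and
# unpacked into the extensions directory the same way as above.
base=https://extdist.wmflabs.org/dist/extensions
for t in LabeledSectionTransclusion-REL1_34-4aa6bfa \
         MobileFrontend-REL1_34-6a8ef84 \
         Scribunto-REL1_34-f7bc2e3; do
  echo wget "$base/$t.tar.gz"
  echo tar -xzf "$t.tar.gz" -C /var/lib/mediawiki/extensions
done
```

Note that skins (like MinervaNeue) come from dist/skins and unpack into /var/lib/mediawiki/skins instead, so they would need a separate loop.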
Do the semantic mediawiki install and config. Last time, we installed Semantic Mediawiki using composer, which is the preferred method. See Web_Server_Documentation.
cd installs
php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === '8a6138e2a05a8c28539c9f0fb361159823655d7ad2deecb371b04a83966c61223adc522b0189079e3e9e277cd72b8897') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php --install-dir=/usr/local/bin --filename=composer
#php -r "unlink('composer-setup.php');"
vi composer.local.json
    {
        "require": {
            "mediawiki/semantic-media-wiki": "~3.1",
            "mediawiki/semantic-result-formats": "~3.1"
        }
    }
composer update --no-dev
Add a line to LocalSettings.php:
    enableSemantics('192.168.2.92');
php maintenance/update.php
cd ../extensions/
#https://www.mediawiki.org/wiki/Extension:Page_Forms
git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/PageForms.git
Then add it to LocalSettings.php.
ConfirmAccount Bug
Installing ConfirmAccount gave an error whenever there was an account request.
Warning: file_exists(): Unable to find the wrapper "mwstore" - did you forget to enable it when you configured PHP? in /../w/extensions/ConfirmAccount/backend/ConfirmAccount.class.php on line 29
This seemed to be a registered bug - see https://phabricator.wikimedia.org/T219859. It appears to happen as a consequence of the file_exists call, and is just a warning. It doesn't affect functionality. So I put an @ in front of the file_exists call, which is poor practice but it worked just fine.
Line 29: if ( $path && @file_exists( $path ) ) {
Other Config
Check the prerequisites are good for Scribunto:
php -r 'echo "pcre: " . ( extension_loaded( "pcre" ) ? PCRE_VERSION : "no" ) . "\n";'
php -r 'echo "mbstring: " . ( extension_loaded( "mbstring" ) ? "yes" : "no" ) . "\n";'
Fix some permissions:
chmod a+x extensions/Scribunto/includes/engines/LuaStandalone/binaries/lua5_1_5_linux_64_generic/lua chmod a+x extensions/SyntaxHighlight_GeSHi/pygments/pygmentize
Tune up php and Apache2:
vi /etc/php/7.2/apache2/php.ini
    change upload_max_filesize to 4M
vi /etc/apache2/apache2.conf
    <Directory /var/www/wiki/images>
        Options -Indexes
    </Directory>
service apache2 restart
MathML
I tried various methods to get MathML to work and always failed. It looks like the community bet on Mathoid working out, but there's been no development on it for 5 months now and it looks dead. The good news is that MathJax works just fine right out of the box:
git clone https://github.com/jmnote/SimpleMathJax.git
In LocalSettings.php:
    wfLoadExtension( 'SimpleMathJax' );
    #$wgSmjInlineMath = [ [ "$", "$" ], [ "\\(", "\\)" ] ];
Note: the commented-out line lets you demark math with LaTeX-like syntax. I disabled it, as I use $ signs way too much in other contexts.
PDFEmbed
PDFEmbed was the extension that I never knew that I always wanted!
git clone https://gitlab.com/HydraWiki/extensions/PDFEmbed.git
In LocalSettings.php:
    wfLoadExtension( 'PDFEmbed' );
SemanticACL
IntraACL (http://wiki.4intra.net/IntraACL) doesn't work with Mediawiki 1.34, so it's out. I also wasn't wild about its patch-based approach.
This time around, I went with SemanticACL for access control. It's in beta, but it's actively maintained and its approach is simple, sane, and harnesses the power of what's already there. No hooks, no patches, no blah... just an extension that does what you want out of the box.
The only other real contender was Extension:AccessControl, which is stable. However, its approach just isn't as clean, and the author's request for funds to translate his documentation from Czech to English didn't endear me to it.
Useful Links for SemanticACL:
- https://www.mediawiki.org/wiki/Manual:User_rights#List_of_groups
- https://www.mediawiki.org/wiki/Category:Page_specific_user_rights_extensions
- https://www.mediawiki.org/wiki/Extension:Semantic_ACL
- https://www.mediawiki.org/wiki/Security_issues_with_authorization_extensions
wget https://extdist.wmflabs.org/dist/extensions/SemanticACL-REL1_34-01ae8be.tar.gz
tar -xzf SemanticACL-REL1_34-01ae8be.tar.gz -C /var/lib/mediawiki/extensions
In LocalSettings.php (for m1.34):
require_once "$IP/extensions/SemanticACL/SemanticACL.php";
To configure security on a page (https://www.mediawiki.org/wiki/Extension:Semantic_ACL#Example)
[[Visible to::whitelist]]
[[Visible to group::team]]
[[Editable by::whitelist]]
[[Editable by user::User:whoever]]
or equivalently (but silently):
{{#set: Visible to=whitelist|Visible to group=team}}
BibTeX
The BibTeX extension doesn't work anymore. Though it never really worked, so it's not much of a loss. I should probably build a replacement but I don't have the time right now.
Upload Multiple Files
Although the Upload multiple files extension installed fine, it is unmaintained and seems to have an issue. I removed its line from LocalSettings.php and deleted its extension directory.
I then installed Simple Batch Upload using a tarball:
In mediawiki/installs:
wget https://github.com/ProfessionalWiki/SimpleBatchUpload/archive/1.6.0.tar.gz
tar -xzf 1.6.0.tar.gz -C /var/lib/mediawiki/extensions
mv /var/lib/mediawiki/extensions/SimpleBatchUpload-1.6.0 /var/lib/mediawiki/extensions/SimpleBatchUpload
In LocalSettings.php:
wfLoadExtension( 'SimpleBatchUpload' );
$wgSimpleBatchUploadMaxFilesPerBatch = ['*' => 10,];
I had previously added Special:MultiUpload|Upload multiple files to http://www.edegan.com/wiki/MediaWiki:Sidebar. I replaced it with a link to Special:BatchUpload.
Allow SVG images
See https://www.mediawiki.org/wiki/Manual:Image_administration#SVG. Essentially, add svg to $wgFileExtensions, then install and designate an image converter. I went with rsvg:
apt-get install librsvg2-bin
vi LocalSettings.php
 $wgSVGConverter = 'rsvg';
Add HitCounters
wget https://extdist.wmflabs.org/dist/extensions/HitCounters-REL1_34-48dd6cb.tar.gz
tar -xzf HitCounters-REL1_34-48dd6cb.tar.gz -C /var/lib/mediawiki/extensions
vi ../LocalSettings.php
 wfLoadExtension( 'HitCounters' );
cd ../maintenance
php update.php
Change the Dbase
I tried to 'rename' the dbase, creating a dedicated dbase user that has access to just the wiki's dbase, and resetting its password.
in /bulk/backups:
mysqldump --password olddbase > mediawiki.sql
mysql -u olduser -p
 CREATE DATABASE wiki;
mysql -u olduser -p wiki < mediawiki.sql
mysql -u olduser -p
 CREATE USER 'wiki'@'localhost' IDENTIFIED BY 'password';
 GRANT ALL PRIVILEGES ON wiki.* TO 'wiki'@'localhost';
 FLUSH PRIVILEGES;
This led to two different errors. First, the cloned database didn't seem to support Semantic Mediawiki somehow. And second, the new user didn't seem to work. This was true even if I gave them rights on the olddbase. So I abandoned the attempt.
Clean up the attempt:
mysql -u olduser -p
 DROP DATABASE wiki;
 DROP USER 'wiki'@'localhost';
I did put an .htaccess file in /var/lib/mediawiki to restrict access to LocalSettings.php, though I expect that this is redundant.
<files LocalSettings.php>
 order allow,deny
 deny from all
</files>
Setting up for advanced template(s) import
Robelbox
Importing the Robelbox, or other, templates on mediawiki is tricky [1], at least the first time that you do it. Once you've got everything up and running to support templates (see above):
- Find the template on another mediawiki installation
- Go to Special:Export on that installation and export the template (but not its revision history)
- Import the template on the wiki using Special:Import, set the interwiki prefix to something that designates the source, like en for English Wikipedias.
I did this for the Robelbox template, which I got from https://en.wikiversity.org/wiki/Special:Export. However, it wasn't usable and I couldn't work out why. I ultimately deleted Robelbox, having found better boxes (see below), but I expect that my process for fixing the later issues would have sorted out the problems here too.
Fixing Template Issues
I got Template:Box-round from mediawiki.org: https://www.mediawiki.org/wiki/Template:Box-round. It required installation of TemplateStyles, which in turn might need JsonConfig:
In mediawiki/installs
wget https://extdist.wmflabs.org/dist/extensions/TemplateStyles-REL1_34-c4d6f25.tar.gz
tar -xzf TemplateStyles-REL1_34-c4d6f25.tar.gz -C /var/lib/mediawiki/extensions
wget https://extdist.wmflabs.org/dist/extensions/JsonConfig-REL1_34-f877d87.tar.gz
tar -xzf JsonConfig-REL1_34-f877d87.tar.gz -C /var/lib/mediawiki/extensions
Add to LocalSettings.php
wfLoadExtension( 'TemplateStyles' );
wfLoadExtension( 'JsonConfig' );
Now Template:Box-round works fine but, like Template:Tl, has an error message on its page about JsonConfig being missing. (Note that Template:Tl previously said Module:TNT was missing, so I got it from www.mediawiki/w/Module:TNT.) JsonConfig seems fine and shows in Special:Version.
I added:
$wgJsonConfigEnableLuaSupport = true;
which gave:
Lua error: bad argument #1 to "get" (not a valid title).
So instead, I put all the following into LocalSettings.php to configure JsonConfig (see [2]):
// Safety: before extension.json, these values were initialized by JsonConfig.php
if ( !isset( $wgJsonConfigModels ) ) {
	$wgJsonConfigModels = [];
}
if ( !isset( $wgJsonConfigs ) ) {
	$wgJsonConfigs = [];
}
$wgJsonConfigEnableLuaSupport = true;

// https://www.mediawiki.org/wiki/Extension:JsonConfig#Configuration
$wgJsonConfigModels['Tabular.JsonConfig'] = 'JsonConfig\JCTabularContent';
$wgJsonConfigs['Tabular.JsonConfig'] = [
	'namespace' => 486,
	'nsName' => 'Data',
	// page name must end in ".tab", and contain at least one symbol
	'pattern' => '/.\.tab$/',
	'license' => 'CC0-1.0',
	'isLocal' => false,
];

// Enable Tabular data namespace on Commons - T148745
$wgJsonConfigInterwikiPrefix = 'commons';
$wgJsonConfigs['Tabular.JsonConfig']['remote'] = [
	'url' => 'https://commons.wikimedia.org/w/api.php'
];
Then I copied over Module:Documentation/styles.css from mediawiki.org, commenting out the background image in line 168. And everything seems to work fine...
I got the following templates from mediawiki (via mediawiki's Special:Export to get dependencies):
- Template:Colored box
- Template:Navbox
- Template:Help box
- Template:Side box
- Template:Note
Note that this overwrote Template:Tl, Template:TNT, and others that I resolved issues with previously.
Now a new set of issues has emerged. These include JsonConfig problems (again), template loops (which I think are coming from Module:Template translation), and missing dependencies (e.g., Template:Mbox and Module:Color contrast). Moreover, most of the templates render the if and other conditional logic statements, rather than executing them. I installed ParserFunctions (which I should have done before) and it solved everything outstanding!
In mediawiki/installs
wget https://extdist.wmflabs.org/dist/extensions/ParserFunctions-REL1_34-4de6f30.tar.gz
tar -xzf ParserFunctions-REL1_34-4de6f30.tar.gz -C /var/lib/mediawiki/extensions
Add to LocalSettings.php
wfLoadExtension( 'ParserFunctions' );
$wgPFEnableStringFunctions = true;
The Front Page
To do the front page, I copied the source of Template:Main page from mediawiki to a page (called Test) and created [Template:Main page/styles.css] using mediawiki's code. Then I rejigged the contents of the page!
The only minor but non-obvious change was that I used h2 headings inside each mainpage_box, rather than h3's. As a consequence, I needed to add the following to Template:Main_page/styles.css:
.mainpage_box h2 {
	border-bottom: none;
}
.mainpage_box h2 .header_icon {
	margin-right: 5px;
}
Old instructions[3] suggest using Special:ExpandTemplate on mediawiki's wiki, with the input text {{:MediaWiki}}. But this isn't necessary as the template doesn't need expanding in its current (at the time of writing) incarnation. Naturally, the page works well on MobileFrontend.
Mass Edit
The Mass edit page contains several examples, two of which show "Undefined Control Sequence" errors. These examples use backslashed square brackets (I can't even write them on the wiki using nowiki tags), which have issues because of Extension:SimpleMathjax. Regardless, the extension seems to work just fine!
Update Linux
Get the system up to date:
apt-get update
apt-get upgrade
During the upgrade I chose:
- keep the local smb.conf
- keep the local grub (new version in /tmp/grub.l1gqsHmubw)
There were dependency problems and other warnings during the process. It finished with:
Errors were encountered while processing: keyboard-configuration xserver-xorg-core xserver-xorg-input-wacom console-setup-linux console-setup ubuntu-minimal E: Sub-process /usr/bin/dpkg returned an error code (1) W: Operation was interrupted before it could finish
I rebooted the server. It came up ok and everything seemed fine (I'm doing this over terminal), but it is still claiming that there are updates. The issue might be related to a known Ubuntu bug: https://bugs.launchpad.net/ubuntu/+source/console-setup/+bug/1770482. The solution(s) might be [4]:
Look at the keyboard config file:
cat /etc/default/keyboard
 # KEYBOARD CONFIGURATION FILE
 # Consult the keyboard(5) manual page.
 XKBMODEL="pc105"
 XKBLAYOUT="us"
 XKBVARIANT=""
 XKBOPTIONS=""
Remove and reinstall the keyboard-configuration:
apt-get remove keyboard-configuration
apt-get install keyboard-configuration
Note that this launches a screen where the only options are Afghani variants... so cancel it!
DEBIAN_FRONTEND=noninteractive apt-get install keyboard-configuration
This ran and changed the keyboard layout file to the default:
 # KEYBOARD CONFIGURATION FILE
 # Consult the keyboard(5) manual page.
 XKBMODEL="pc105"
 XKBLAYOUT="us,af"
 XKBVARIANT=","
 XKBOPTIONS="grp_led:scroll"
vi /etc/default/keyboard
I manually removed the af option and the commas, essentially reverting the file (I like a scroll light).
cd /usr/share/X11/xkb/symbols
ln -s us en
This was the other solution offered. I was missing an en option, so that might be it.
apt-get install keyboard-configuration
Now it says that there is nothing to do, which is promising.
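The manual vi edit that reverts the layout file can be done with sed instead. A sketch on a scratch copy (the real target is /etc/default/keyboard; the us,af layout values come from the text above):

```shell
# Revert the keyboard layout file without the interactive dialog.
# Demonstrated on a scratch copy; the real file is /etc/default/keyboard.
KBD=/tmp/keyboard-demo
cat > "$KBD" <<'EOF'
XKBMODEL="pc105"
XKBLAYOUT="us,af"
XKBVARIANT=","
XKBOPTIONS="grp_led:scroll"
EOF

# Drop the stray 'af' layout and the matching empty variant,
# keeping the scroll-lock LED option.
sed -i -e 's/^XKBLAYOUT="us,af"/XKBLAYOUT="us"/' \
       -e 's/^XKBVARIANT=","/XKBVARIANT=""/' "$KBD"

grep '^XKB' "$KBD"
```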
Try the update again:
apt-get update && apt-get upgrade
Nothing happened and everything seems fine...
shutdown -r now
On boot, the box reports:
Welcome to Ubuntu 18.04.5 LTS (GNU/Linux 4.15.0-45-generic x86_64) ... 36 packages can be updated. 30 updates are security updates.
Running apt-get upgrade gives:
The following packages have been kept back: fwupd fwupdate fwupdate-signed libfwup1 libgl1-mesa-dri libreoffice-avmedia-backend-gstreamer libreoffice-base-core libreoffice-calc libreoffice-core libreoffice-draw libreoffice-gnome libreoffice-gtk3 libreoffice-impress libreoffice-math libreoffice-ogltrans libreoffice-writer libxatracker2 linux-generic linux-headers-generic linux-image-generic netplan.io python3-software-properties python3-uno software-properties-common software-properties-gtk 0 upgraded, 0 newly installed, 0 to remove and 25 not upgraded.
The latest LTS version (at the time of writing) is 20.04.1 (see https://wiki.ubuntu.com/Releases). So I could do an:
apt-get dist-upgrade
But I should really do a full backup and everything first, so that isn't going to happen today.
Wordpress
Overview
Rather than trying to update wordpress, I think it best to install the latest version and use the old dbase. This will likely cause problems with images... but we made several suboptimal choices when we built the last version, including using a non-standard theme and customizing it in a way that prevented updates.
There's a useful wordpress article on the basics of the approach, albeit from a hosted install perspective: https://www.wpbeginner.com/wp-tutorials/how-to-restore-a-wordpress-site-with-just-database-backup/
The main install instructions are: https://wordpress.org/support/article/how-to-install-wordpress/
For theme customization beyond that done in the interface, or through a plugin (like Code Snippets [5]), wordpress says the best approach is to create a child theme [6].
Choosing the theme (https://wordpress.org/themes/ and https://wordpress.com/themes) is a major decision. This time I want a much more standard theme, that has better plugin and widget support, is responsive and gives a good mobile interface. I also don't want to pay but will have to trade that off against doing customization to make it look distinct.
It seems that Twenty Fifteen has the most active installs, but all of the Twenty series (the default themes made by wordpress) are wildly popular. It might be worth using Twenty Twenty, as it is the most recent and takes advantage of the block editor (Twenty Nineteen does too but gets mediocre reviews), and I like the look of Twenty Fourteen.
Outside of the defaults, OceanWP is eCommerce oriented but looks good and is very popular. Neve sits between OceanWP and GeneratePress, which has a more magazine/news focus, and all three take advantage of the new block editor (Gutenberg), which was introduced in WPv5 (initial release in 2018).
Pre-install
Check PHP and MySQL. I need PHP >=7.3 and MySQL >=5.6 but:
php --version
 PHP 7.2.24-0ubuntu0.18.04.6 (cli) (built: May 26 2020 13:09:11) ( NTS )
mysql --version
 mysql Ver 14.14 Distrib 5.7.31, for Linux (x86_64) using EditLine wrapper
I might be able to upgrade my version of PHP without upgrading Ubuntu (see https://linuxize.com/post/how-to-install-php-on-ubuntu-18-04/). However, it is probably a good idea to just fix everything...
Upgrading Linux Distro
So it turns out that I shouldn't have done that last update... I do have an Xwindows Server on the box, running Gnome, and now I can't log in using the GUI on the box itself (it loops back to the login screen). This box doesn't contain the GPUs, just the database server, so the GUI isn't key, but it would be nice to have it working again. Hopefully, an upgrade will fix that, as well as other issues.
Backing off
First, mount the USB drive. Find what's mounted and what the dev is:
mount -t ext4 (or just mount for everything)
ls -l /dev/disk/by-id/usb* (or fdisk -l or lsblk)
mkdir -p /media/usb
mount /dev/sda1 /media/usb
Back up the databases:
psql postgres
 \l
As researcher and in /bulk/backups/:
mv lbo_Fc.dump lbo_Fc.dump.org
pg_dump -Fc allpatentsprocessed > allpatentsprocessed_Fc.dump
pg_dump -Fc accelerators > accelerators_Fc.dump
pg_dump -Fc grants > grants_Fc.dump
pg_dump -Fc incubators > incubators_Fc.dump
pg_dump -Fc lbo > lbo_Fc.dump
pg_dump -Fc stockmarket > stockmarket_Fc.dump
pg_dump -Fc crunchbase3 > crunchbase3_Fc.dump
pg_dump -Fc vcdb20h1 > vcdb20h1_Fc.dump
pg_dump -Fc vcdb4 > vcdb4_Fc.dump
mysql -u root -p
 SHOW DATABASES;
 \q
mysqldump --databases --password mcnair > mcnair.sql
mysqldump --databases --password wordpress > wordpress.sql
mysqldump --password mcnair > mediawiki.sql
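The per-database pg_dump calls above can be collapsed into a loop. A dry-run sketch (database names taken from the list above): it echoes the commands so they can be eyeballed first; pipe the output to sh to actually run the dumps.

```shell
# Dry run of the per-database dumps above: prints one pg_dump command per
# database. Pipe to sh (as researcher, in /bulk/backups/) to execute.
for db in allpatentsprocessed accelerators grants incubators lbo \
          stockmarket crunchbase3 vcdb20h1 vcdb4; do
  echo "pg_dump -Fc $db > ${db}_Fc.dump"
done
```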
Do the file transfers
mkdir /media/usb/mother-2020-08-09
mkdir /media/usb/mother-2020-08-09/bulk
rsync -av --progress --exclude="mcnair" /bulk/ /media/usb/mother-2020-08-09/bulk/
mkdir /media/usb/mother-2020-08-09/html
rsync -av --progress /var/www/html/ /media/usb/mother-2020-08-09/html/
Finally:
umount /media/usb
Do the upgrade
Run:
apt update
apt upgrade
apt dist-upgrade
apt autoremove
do-release-upgrade
If no release is found because you are too early, add -d to allow development releases (it will still install the LTS if that's available):
do-release-upgrade -d
This failed on the first attempt. So I did:
grep ERROR /var/log/dist-upgrade/main.log
grep BROKEN /var/log/dist-upgrade/apt.log
apt-get remove postgresql-10-postgis-2.4
Then:
do-release-upgrade -d
I selected some choices (keep smb.conf, don't notify me of whatever, etc.). I let it replace postgres10, but it still gave me an "Obsolete Major Version" warning on postgres (I said ok).
Address the upgrade issues
The first casualty of the upgrade was the networking configuration. You'd think the developers would have figured that one out, as remote upgrades can leave boxes DOA until someone gets physical access. Nevertheless, the fix is straightforward.
The old ifup and down and eth0 etc. interface system is gone now, taking its config with it. To get the networking back:
ifconfig
Outdated now, I think, but it still shows what's up...
ip a
This will get you the names of the interfaces etc.
I already had a .yaml under a different interface name that set up DHCP, so I used it as a template for the interface that I wanted up that way:
cp /etc/netplan/01-netcfg.yaml /etc/netplan/99_config.yaml
vi /etc/netplan/99_config.yaml
 change the interface name to eno0
netplan apply
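For reference, a minimal DHCP netplan file of the kind edited above looks like the sketch below. The interface name eno0 comes from the text; the file is written to /tmp here, whereas the real location is /etc/netplan/, followed by netplan apply.

```shell
# Minimal netplan DHCP config (interface name eno0 assumed from the text).
# Written to /tmp for safety; copy to /etc/netplan/ and run netplan apply.
cat > /tmp/99_config.yaml <<'EOF'
network:
  version: 2
  renderer: networkd
  ethernets:
    eno0:
      dhcp4: true
EOF
cat /tmp/99_config.yaml
```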
Now everything looks good for a foundation - apache2 is working, SSH is working, but I need to do a minor config fix for the wiki.
apt-get install php-mbstring
apachectl restart
And the wiki comes back up, but with an error notice. The issue seems to be with PHP 7.4, and it looks like it affects both mediawiki and wordpress, though wordpress might have fixed it. Regardless, it is possible to install 7.3 as well and use that with apache2.
add-apt-repository ppa:ondrej/php
apt-get update
apt-get install php7.3
apt-get install php7.3-cli php7.3-common php7.3-json php7.3-opcache php7.3-mysql php7.3-mbstring php7.3-zip php7.3-fpm php7.3-intl php7.3-simplexml
Note we may need to fix some config again, as it said:
 Creating config file /etc/php/7.3/apache2/php.ini
a2dismod php7.4
a2enmod php7.3
I ignored the following notices for now:
 NOTICE: To enable PHP 7.3 FPM in Apache2 do:
 NOTICE: a2enmod proxy_fcgi setenvif
 NOTICE: a2enconf php7.3-fpm
systemctl restart apache2
update-alternatives --set php /usr/bin/php7.3
update-alternatives --set phar /usr/bin/phar7.3
update-alternatives --set phar.phar /usr/bin/phar.phar7.3
update-alternatives --set phpize /usr/bin/phpize7.3
update-alternatives --set php-config /usr/bin/php-config7.3
 error: no alternatives for php-config (ignored for now)
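The five update-alternatives calls above follow one pattern and can be generated with a loop. A dry-run sketch (tool names and the 7.3 version from the text): it prints the commands; pipe the output to sh as root to run them.

```shell
# Dry run: generate the update-alternatives calls for every PHP CLI tool.
# Pipe to sh (as root) to execute for real.
VER=7.3
for tool in php phar phar.phar phpize php-config; do
  echo "update-alternatives --set $tool /usr/bin/${tool}${VER}"
done
```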
And the wiki now seems happy!
PostGIS Issues
I also checked postgres and everything seemed ok:
su researcher
psql vcdb4
 \l
They are all there.
 \dx
All my extensions report back.
Update: It seems something did go wrong. Just because the extensions report back doesn't mean they work! When I try to run queries that use PostGIS, I get:
SQL Error [58P01]: ERROR: could not access file "$libdir/postgis-2.4": No such file or directory
I tried updating the extension (I'm pretty sure that I'm running 2.4.3):
ALTER EXTENSION postgis UPDATE TO "2.4.3";
But that didn't fix anything. I checked the versions:
select version();
 PostgreSQL 10.14 (Ubuntu 10.14-0ubuntu0.18.04.1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0, 64-bit
So somehow I'm still running version 10!
select PostGIS_full_version();
 ERROR: could not access file "$libdir/postgis-2.4": No such file or directory
 CONTEXT: SQL statement "SELECT public.postgis_lib_version()" PL/pgSQL function postgis_full_version() line 25 at SQL statement
So I backed off the data from the two dbases that I'd used since the upgrade:
pg_dump -Fc stockmarket > stockmarket_Fc_20201023.dump
pg_dump -Fc vcdb4 > vcdb4_Fc_20201023.dump #FAILED!
The second backoff failed:
pg_dump: [archiver (db)] query failed: ERROR: could not access file "$libdir/postgis-2.4": No such file or directory pg_dump: [archiver (db)] query was: SELECT a.attnum, a.attname, a.atttypmod, a.attstattarget, a.attstorage, t.typstorage, a.attnotnull, a.atthasdef, a.attisdropped, a.attlen, a.attalign, a.attislocal, pg_catalog.format_type(t.oid,a.atttypmod) AS atttypname, array_to_string(a.attoptions, ', ') AS attoptions, CASE WHEN a.attcollation <> t.typcollation THEN a.attcollation ELSE 0 END AS attcollation, a.attidentity, pg_catalog.array_to_string(ARRAY(SELECT pg_catalog.quote_ident(option_name) || ' ' || pg_catalog.quote_literal(option_value) FROM pg_catalog.pg_options_to_table(attfdwoptions) ORDER BY option_name), E', ') AS attfdwoptions FROM pg_catalog.pg_attribute a LEFT JOIN pg_catalog.pg_type t ON a.atttypid = t.oid WHERE a.attrelid = '19998614'::pg_catalog.oid AND a.attnum > 0::pg_catalog.int2 ORDER BY a.attnum
Postgres Upgrade Attempt (Failed)
My changes weren't substantial, so I proceeded with an upgrade. First I checked to see if I had postgres12 installed and listening on another port or not:
locate postgres
ls /usr/bin/postgres
dpkg --get-selections | grep postgres
pg_lsclusters
 Ver Cluster Port Status Owner    Data directory              Log file
 10  main    5432 online postgres /data/postgres              /var/log/postgresql/postgresql-10-main.log
 12  main    5433 online postgres /var/lib/postgresql/12/main /var/log/postgresql/postgresql-12-main.log
pg_upgradecluster 10 main
This failed:
 pg_dump: error: query failed: ERROR: could not access file "$libdir/postgis-2.4": No such file or directory
So... I can't automatically upgrade without first fixing the issue with v10 and postgis.
add-apt-repository http://apt.postgresql.org/pub/repos/apt
But that put the following into /etc/apt/sources.list:
 deb http://apt.postgresql.org/pub/repos/apt focal main
vi it to (see https://wiki.postgresql.org/wiki/Apt):
 deb http://apt.postgresql.org/pub/repos/apt focal-pgdg main
wget --quiet -O - http://apt.postgresql.org/pub/repos/apt/ACCC4CF8.asc | sudo apt-key add -
apt-get update
This throws a warning:
 N: Skipping acquire of configured file 'main/binary-i386/Packages' as repository 'http://apt.postgresql.org/pub/repos/apt focal-pgdg InRelease' doesn't support architecture 'i386'
But the wretched thing still doesn't seem to be available:
apt-get install postgresql-10-postgis-2.4
 Package postgresql-10-postgis-2.4 is not available, but is referred to by another package.
Trying a manual approach. Get the file, put it in /bulk/temp and cd there. Then:
apt-get install ./postgresql-10-postgis-2.4_2.4.3+dfsg-4_i386.deb
This failed too - there are unmet dependencies and they are 'not installable'.
Switching over the installations
So, I took the alternative approach of changing the data folders [7].
The plan:
- Take version 10 offline
- Move version 10's data to a new location (/var/lib/postgresql/10/main)
- Switch the ports of versions 10 and 12
- Move version 12's data to /data
- Put version 12 online
- Load up the data in version 12!
- Optionally wipe out the old installation
Shut it down:
pg_ctlcluster 12 master start
pg_lsclusters
 Ver Cluster Port Status Owner    Data directory                Log file
 10  main    5432 online postgres /data/postgres                /var/log/postgresql/postgresql-10-main.log
 12  master  5433 online postgres /var/lib/postgresql/12/master /var/log/postgresql/postgresql-12-master.log
systemctl stop postgresql
systemctl status postgresql
Edit the config files:
vi /etc/postgresql/10/main/postgresql.conf
 data_directory = '/var/lib/postgresql/10/main'
 port = 5433
vi /etc/postgresql/12/master/postgresql.conf
 data_directory = '/data/postgres'
 port = 5432
 listen_addresses = '*'
While we are here, do some performance tuning:
 shared_buffers = 512MB
 huge_pages = try
 temp_buffers = 8GB
 work_mem = 4GB
 maintenance_work_mem = 64
 effective_cache_size = 384GB
Note that I didn't reduce the number of connections (or max_wal_senders, which must be < max_connections), or change max_stack_depth (which gives an error if you set it too high).
vi /etc/postgresql/12/master/pg_hba.conf
 Copy over the config to allow access from inside the network
Move the data:
df
 to check diskspace
rm -R /var/lib/postgresql/10/main
 Note that none of the config files in here were valid (though you should check this is true before you do it!)
rsync -av /data/postgres/ /var/lib/postgresql/10/main
 Takes a while, but make sure it is all done before the next step.
rm -R /data/postgres
rsync -av /var/lib/postgresql/12/master/ /data/postgres
systemctl start postgresql
pg_lsclusters
 Ver Cluster Port Status Owner    Data directory              Log file
 10  main    5433 online postgres /var/lib/postgresql/10/main /var/log/postgresql/postgresql-10-main.log
 12  master  5432 online postgres /data/postgres              /var/log/postgresql/postgresql-12-master.log
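A safer pattern for the moves above is to verify the copy before deleting the source, rather than running rm first. A sketch on scratch directories (paths are placeholders; cp -a stands in for rsync -av so it runs anywhere):

```shell
# Copy, verify, then delete -- the source is only removed if diff -r finds
# no differences. Scratch paths; the real dirs are the postgres data dirs.
SRC=/tmp/pgmove-src
DST=/tmp/pgmove-dst
rm -rf "$SRC" "$DST"
mkdir -p "$SRC"
echo 'data' > "$SRC/base.txt"

cp -a "$SRC" "$DST"          # copy first...
diff -r "$SRC" "$DST" \
  && rm -rf "$SRC"           # ...remove the source only if identical
```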
Do the installs for some extensions:
apt-get install postgresql-12-plr
apt-get install postgresql-plperl-12 postgresql-plpython3-12
Check it all works:
psql postgres
 CREATE EXTENSION postgis;
 CREATE EXTENSION plr;
 CREATE EXTENSION plperl;
 CREATE EXTENSION plpython3u;
 \dx
                                  List of installed extensions
   Name   | Version |   Schema   |                           Description
 ---------+---------+------------+----------------------------------------------------------------------
  plpgsql | 1.0     | pg_catalog | PL/pgSQL procedural language
  postgis | 3.0.0   | public     | PostGIS geometry, geography, and raster spatial types and functions
 (2 rows)
Make the user:
createuser --interactive researcher
Then restore the databases (as researcher in /bulk/backup):
createdb stockmarket
pg_restore -Fc -d stockmarket stockmarket_Fc_20201023.dump
createdb vcdb4
pg_restore -Fc -d vcdb4 vcdb4_Fc.dump
The restore threw some errors related to not having extension plpythonu, but otherwise seemed fine. The issue seems to be that pythonu is python2[8], and python2 is not available for postgres 12 (it might be here: https://wiki.postgresql.org/wiki/Apt):
apt-cache search ".*plpython*.*"
Other Fixes
Remove redundant user accounts:
cat /etc/passwd
userdel -r username
I need to get Xwindows set up again. My best guess as to the cause is that leftover Nvidia drivers, from attempts to install GPUs on this box, went bad in an earlier apt-get upgrade, but I can't see them listed:
dpkg -l | grep nvidia-driver
There is a .Xauthority file, and an .ICEauthority file, in /home/ed and both are owned by ed:ed. The former is empty (0 bytes) and the latter has some non-UTF8 (I think?) characters in it. I'm not sure if either is an issue.
I didn't see xserver-xorg-video-nouveau in the package list, or any video driver module, so I installed nouveau:
dpkg -l
lsmod | more
apt install xserver-xorg-video-nouveau
I'm not sure if I should be fixing my boot image or not...
shutdown -r now
lsmod | more
After doing this, the login would give a local desktop but neither the keyboard nor the mouse worked. I tried uninstalling and reinstalling keyboard-configuration again.
apt-get remove keyboard-configuration
apt-get install keyboard-configuration
shutdown -r now
But that just put me back where I was: with a login loop problem. So I tried switching to lightdm:
apt-get install lightdm
And it worked even before a reboot. After a reboot, I had a different login screen but the actual desktop looked the same. The .Xauthority file is now 51 bytes big and I suddenly have a .xsession-errors, which contains a list of environment settings taking place... However, the machine then silently crashed that night and again the following morning. I couldn't find a specific cause in the logs, but there did seem to be a number of X and GNOME problems:
journalctl -b -1
journalctl --since "1 hour ago"
I ran an update from the GUI, which may have helped. However, there was a warning about an issue with a screensaver the first time that I loaded lightdm, and the crashes seemed to happen sometime after a clean boot. So I uninstalled lightdm, and installed gdm (which failed as installed already) and rebooted but got no GUI. Then I uninstalled and reinstalled gdm and everything seems fine now.
apt-get remove lightdm
apt-get remove gdm3
apt-get install gdm3
Incidentally, I left a clock running in a terminal so that I could see when the box went down if it crashed again. The clock code is:
while [ 1 ] ; do echo -en "$(date +%T)\r" ; sleep 1; done
Important Moves
I kept the old versions of mediawiki and wordpress and moved them to /bulk/retired (renaming to yyyymmdd dates):
mv /var/lib/mediawiki26082020 /bulk/retired/
mv /bulk/retired/mediawiki26082020 /bulk/retired/mediawiki20200826
mv /var/www/html/blog20200809 /bulk/retired/
Wordpress Redux
Install
First, move the old folder to a new name, so that it is there for backup and then get the new install and unpack it.
cd /bulk/installs
wget https://wordpress.org/latest.tar.gz
mv /var/www/html/blog /var/www/html/blog20200809
tar -xzf latest.tar.gz -C /var/www/html/
cd /var/www/html/
mv wordpress/ blog/
chown -R www-data:www-data blog
Put an .htaccess file in that folder to restrict access while we work:
vi blog/.htaccess
 <RequireAny>
  Require ip 192.168.2.1
 </RequireAny>
Set up
Then set up the dbase by editing wp-config.php (it's easiest to modify the sample).
cp blog/wp-config-sample.php blog/wp-config.php
vi blog/wp-config.php
Note: get some keys from https://api.wordpress.org/secret-key/1.1/salt/
Then the backend works: go to http://www.edegan.com/blog/wp-admin! However, the health check shows a missing required module and two missing recommended modules. Fix that:
apt-get install php7.3-gd
apt-get install php7.3-curl
apt-get install php7.3-imagick
apachectl restart
Ironically, it then recommends that I upgrade to PHP 7.4... but that would just cause issues for mediawiki. On the other hand, everything is now green and just 4 groups of recommendations remain.
Config
See Wordpress Blog Site (Tool) for the McNair Center's build.
Using www.edegan.com/blog/wp-admin I configured the blog as follows:
- Select Twenty Twenty as the theme
- Add the permalink code to the .htaccess file, so that the URLs will work with postnames
- Copy over images to wp-content/uploads (use cp -a to maintain permissions)
- Change the site name to https (after fixing the https setup, see below)
Install plugins:
- Yoast SEO
- Wordfence Security
- Disable Comments
- Site Kit by Google (set up once live!)
- Pixabay
I also added:
- CoBlocks (free)
- Advanced Gutenberg (free)
- Otter
I didn't add Co-Authors Plus (https://wordpress.org/plugins/co-authors-plus/) as it hasn't been tested on the latest version of wordpress. There are other plugins that offer equivalent functionality if I need one later.
Other plugins I might want are:
- Revive Old Post (share with twitter)
- Optimole (optimize images)
- WP Rocket (implement cache)
Notes:
- Twitter embedding: https://www.wpbeginner.com/wp-tutorials/how-to-display-recent-tweets-in-wordpress-with-twitter-widgets/
Hardening Wordpress
I hardened the wordpress installation: https://wordpress.org/support/article/hardening-wordpress/
This included:
- Fixing file ownership: to fully harden, change ownership of everything to root, except wflogs, uploads, and themes in wp-content, which should be owned by www-data. However, then you won't be able to install plugins etc. A compromise is -R root:root for blog and then www-data:www-data for wp-content.
- Check file permissions: Everything is 644, except wp-content which is 755
- Checking dbase rights and setting new passwords.
- Changing passwords on old accounts (with posts, so the accounts shouldn't be deleted) to random strong strings.
- Fixing up .htaccess file to impose restrictions
- Install Sucuri
- Enable more logging
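The 644/755 permission scheme in the list above is usually applied with find. A sketch on a scratch tree (the real target is /var/www/html/blog; the chown lines in the comment reflect the compromise described above and must run as root):

```shell
# Apply the hardening permissions (files 644, directories 755) to a
# scratch copy of a blog tree; the real target is /var/www/html/blog.
BLOG=/tmp/blog-demo
rm -rf "$BLOG"
mkdir -p "$BLOG/wp-content/uploads"
touch "$BLOG/index.php" "$BLOG/wp-content/uploads/img.png"

find "$BLOG" -type f -exec chmod 644 {} +
find "$BLOG" -type d -exec chmod 755 {} +

# For the ownership compromise described above, follow with (as root):
#   chown -R root:root /var/www/html/blog
#   chown -R www-data:www-data /var/www/html/blog/wp-content
```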
Checking user rights in the dbase and changing their password:
mysql --user=root -p
 use wordpress
 SELECT User FROM mysql.user;
 SHOW GRANTS FOR 'username'@'localhost';
 SET PASSWORD FOR 'username'@'localhost'='newpassword';
(Note that this shouldn't be logged in the clear on the server, but might be on a client. Delete .mysql_history at the end of your session.)
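Deleting the client-side history mentioned above can be done safely even when the file is absent. The MYSQL_HISTFILE environment variable is honored by the mysql client; the default location is ~/.mysql_history:

```shell
# Clear the mysql client's query history. MYSQL_HISTFILE overrides the
# default ~/.mysql_history location if set; rm -f is quiet if absent.
HIST="${MYSQL_HISTFILE:-$HOME/.mysql_history}"
rm -f "$HIST"
```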
.htaccess in wp-includes:
# Block the include-only files.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^wp-admin/includes/ - [F,L]
RewriteRule !^wp-includes/ - [S=3]
RewriteRule ^wp-includes/[^/]+\.php$ - [F,L]
RewriteRule ^wp-includes/js/tinymce/langs/.+\.php - [F,L]
RewriteRule ^wp-includes/theme-compat/ - [F,L]
</IfModule>
# BEGIN WordPress
The #BEGIN WordPress tag is redundant as the file is 644 root.
Add the following to .htaccess in the wordpress dir:
<files wp-config.php>
 order allow,deny
 deny from all
</files>
If there are plugin installation issues then add to wp-config.php
define('FS_METHOD','direct');
Once I'm all done with the theme etc., I can uncomment the following from wp-config.php
define('DISALLOW_FILE_EDIT', true);
Redesign
I built a Branding palette to standardize the colors, and I installed the Twentig plugin to give extra configuration options.
I changed the site colors, added the logo and the tag line, and made other config changes.
Then, I added custom CSS as follows.
To reduce the header spacing:
.header-inner { padding: 1.5rem 0; }
To remove the title from the landing page:
.page-id-2169 .entry-title { display: none !important; }
.page-id-2169 .entry-header { padding: 0; }
To do:
- I need to add social media icons! That might be as easy as adding the social media menu [9].
- Get a related posts widget? There's Yet Another Related Posts Plugin, Contextual Related Posts, and Inline Related Posts... I went with YARPP, as it is the most popular. It is apparently resource-heavy.
I tried the following blocks plugins:
- Ultimate Addons for Gutenberg
- It's free and adds some nice basic functionality
- Post blocks include: Post Carousel, Post Grid, Post Masonry, Post Timeline, Advanced Columns -- but customization is limited and I can't do one post
- Getwid:
- Pretty highly customizable.
- Can specify which posts to show in 3 blocks (Custom Post Type, Post Carousel, and Post Slider) and can build custom templates to arrange how they are displayed
- Post blocks: Recent Posts, Custom Post Type, Post Carousel, and Post Slider
- Redux -- It's a templates library. You get 5 for free and they upsell hard.
- ZeGuten - Couldn't find it
- Advanced Gutenberg - It's free and widely used...
- CoBlocks -- Does the basics
- Posts -- Can't specify specific posts. Can do category.
- Post Carousel -- Likewise.
- Stackable:
- It requested opt-in, which I didn't like, and it wants you to 'Go Premium'.
- It has settings for everything! By far the most detailed configuration.
- Useful blocks:
- Posts -- can't seem to specify a specific post
- Advanced Columns and Grids -- for layout
- Card -- could make posts links with buttons
- Feature/Feature Grid -- likewise
- Container? Might be helpful
- Gutenberg Post Blocks
- Untested with my version. Seems to work.
- Has lots of options but does full-page things. Can limit to a post using include but has next page links...
- Tried to push for an update to pro.
- Magical Posts Display -- I dumped it for being too weird.
- Otter Blocks
- Google maps block and other useful things... I just don't need it right now.
Built-in:
- Latest Posts (widget)
Chosen block plugins:
- Getwid -- It's outstanding and embraces templates for serious bespoke customization
- Stackable -- For its option-based customization
- I might add back coblocks, Advanced Gutenberg and Ultimate Addons for Gutenberg
I installed WP Mail SMTP Lite.
I first set it up to use Google. Essentially you need to sign in to Google and set up an API in the console: https://console.developers.google.com/flows/enableapi?apiid=gmail&pli=1. However, this seemed to introduce a massive security hole unless you have G Suite, so I abandoned this approach.
I had previously set up SMTP through Google for the wiki (see Research_Computing_Configuration#Confirm_Account). So, I used the same approach with Wordpress. In WP Mail SMTP Lite choose 'Other' (see the second method). Then edit wp-config.php to hardcode the values (this ensures that the password, which is stored in plain text, is a little more secure):
define( 'WPMS_ON', true ); // You MUST set this if you want hardcoded values to work!
define( 'WPMS_LICENSE_KEY', '' );
define( 'WPMS_MAIL_FROM', 'blog@edegan.com' );
define( 'WPMS_MAIL_FROM_FORCE', true );
define( 'WPMS_MAIL_FROM_NAME', 'The Blog at EdEgan.com' );
define( 'WPMS_MAIL_FROM_NAME_FORCE', true );
define( 'WPMS_MAILER', 'smtp' ); // Possible values: 'mail', 'gmail', 'mailgun', 'sendgrid', 'smtp'.
define( 'WPMS_SET_RETURN_PATH', true );
define( 'WPMS_SMTP_HOST', 'ssl://smtp.gmail.com' );
define( 'WPMS_SMTP_PORT', 465 );
define( 'WPMS_SSL', 'ssl' ); // Possible values: '', 'ssl', 'tls' -- note TLS is not STARTTLS.
define( 'WPMS_SMTP_AUTH', true );
define( 'WPMS_SMTP_USER', 'username@gmail.com' ); // SMTP authentication username, only used if WPMS_SMTP_AUTH is true.
define( 'WPMS_SMTP_PASS', 'password generated by Google' );
define( 'WPMS_SMTP_AUTOTLS', true );
Author Comments
The blog supports multiple authors and by default, Wordpress emails an author whenever one of their posts gets a comment. If you'd like to disable author comment emails but keep the moderator emails, there's a simple fix:
Just go to wp-admin/options.php and set 'comments_notify' to 0. (See https://codex.wordpress.org/Option_Reference)
More complicated methods involve writing your own plugin [10] to refine wp_new_comment_notify_postauthor[11] or changing the hooks[12] used in wp-includes/comment.php:
$maybe_notify = apply_filters( 'notify_post_author', $maybe_notify, $comment_ID );
Social Media Integration
Getting the social media icons on the menu and correctly linked up is very straightforward. Follow the guide for Twenty Fifteen, which also works for Twenty Twenty.
Getting some share buttons was more problematic, particularly as my planned social media usage is somewhat atypical (Twitter, LinkedIn, and Reddit, really in reverse order), and because I don't want to pay anything.
The free version of Revive Old Posts lets you push content to Twitter and Facebook, but they want you to pay to push to LinkedIn.
The best free options seem to be:
- AddToAny Share Buttons - Integrates with Google Analytics
- Simple Social Icons - The simplest option
- Shared Counts -- Counts hits (but using a 3rd party for data?)
- WordPress Social Login - if you want users to log in using their SM accounts (note: has a bimodal ratings distro)
- JetPack -- The plugin used by wordpress.com for this functionality. The free version should suffice, but this thing is a monster. It also uses an account on the wordpress.com cloud, which is a pain for those who are self-hosting.
I went with AddToAny, as it had the most installations, is entirely open-source, and offers all the functionality I need.
Avoiding JetPack
I tried to add a profile picture, but by default, WordPress uses Gravatar, which, surprise, surprise, links to your WordPress.com account... and to add a self-hosted site, you have to install JetPack. At this point, I felt harassed, and doubly so because I didn't install JetPack and yet, somehow, the profile picture correctly updated from the one I'd posted on Gravatar. What the heck?
SEO
I used the Site Kit plugin for WordPress, and for mediawiki I made a sitemap to submit to Google.
See https://www.mediawiki.org/wiki/Manual:GenerateSitemap.php
In mediawiki:
mkdir sitemap
php maintenance/generateSitemap.php --memory-limit=50M --fspath=/var/www/html/mediawiki/sitemap/ --identifier=edegancom --urlpath=/sitemap/ --server=https://www.edegan.com --compress=yes
Then submit it to Google... I did this by making an alias in apache2.conf from sitemap to /var/www/html/mediawiki/sitemap/, then submitting
https://www.edegan.com/sitemap/sitemap-index-edegancom.xml
# In retrospect, I wish I'd used an identifier with 'wiki' in it, but what the hey.
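Since the wiki keeps changing, the sitemap needs regenerating periodically. Here is a sketch of a weekly cron script; the path and schedule are my assumptions, and the flags are copied from the generateSitemap.php command above:

```shell
#!/bin/sh
# Hypothetical /etc/cron.weekly/mediawiki-sitemap -- regenerates the sitemap
# in place; Google re-fetches the submitted URL on its own schedule.
php /var/www/html/mediawiki/maintenance/generateSitemap.php \
    --memory-limit=50M \
    --fspath=/var/www/html/mediawiki/sitemap/ \
    --identifier=edegancom \
    --urlpath=/sitemap/ \
    --server=https://www.edegan.com \
    --compress=yes
```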
And with that success behind you, install Google XML Sitemaps on Wordpress, choose some settings (on Settings -> XML-Sitemap), and then post the URL to Google:
https://www.edegan.com/blog/sitemap.xml
It seems Yoast already builds a sitemap; you just need to submit it to Google... (I uninstalled XML Sitemaps):
https://www.edegan.com/blog/sitemap_index.xml
HTTPS
To set up HTTPS using Let's Encrypt, see https://linuxize.com/post/secure-apache-with-let-s-encrypt-on-ubuntu-20-04/
Install it and make some directories...
apt update
apt install certbot
openssl dhparam -out /etc/ssl/certs/dhparam.pem 2048   # takes ~20 secs
mkdir -p /var/lib/letsencrypt/.well-known
chgrp www-data /var/lib/letsencrypt
chmod g+s /var/lib/letsencrypt
Set up the config files
vi /etc/apache2/conf-available/letsencrypt.conf

Alias /.well-known/acme-challenge/ "/var/lib/letsencrypt/.well-known/acme-challenge/"
<Directory "/var/lib/letsencrypt/">
    AllowOverride None
    Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec
    Require method GET POST OPTIONS
</Directory>
vi /etc/apache2/conf-available/ssl-params.conf

SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
SSLCipherSuite ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
SSLHonorCipherOrder off
SSLSessionTickets off
SSLUseStapling On
SSLStaplingCache "shmcb:logs/ssl_stapling(32768)"
SSLOpenSSLConfCmd DHParameters "/etc/ssl/certs/dhparam.pem"
Header always set Strict-Transport-Security "max-age=63072000"
Enable some apache2 mods!
a2enmod ssl
a2enmod headers
a2enmod http2
a2enconf letsencrypt
a2enconf ssl-params
systemctl reload apache2
Run certbot!
certbot certonly --agree-tos --email ed@edegan.com --webroot -w /var/lib/letsencrypt/ -d edegan.com -d www.edegan.com

Note that I needed an @ entry in my A record for edegan.com pointed to my IP address to get the main challenge to succeed.
Then set up a new apache2 config file (in /etc/apache2):
mv sites-available/000-default.conf sites-available/000-default.conf.bak
vi sites-available/edegan.com.conf

<VirtualHost *:80>
    ServerName www.edegan.com
    ServerAdmin ed@edegan.com
    Redirect permanent / https://www.edegan.com/
</VirtualHost>

<VirtualHost *:443>
    ServerName www.edegan.com
    Protocols h2 http/1.1
    DocumentRoot /var/www/html
    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
    SSLEngine On
    SSLCertificateFile /etc/letsencrypt/live/edegan.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/edegan.com/privkey.pem

    # Other Apache Configuration
    Alias /wiki /var/www/html/mediawiki/index.php
    RewriteEngine On
    RewriteRule ^/*$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]
</VirtualHost>

ln -s sites-available/edegan.com.conf sites-enabled/edegan.com.conf
systemctl reload apache2
Test it by going to https://www.ssllabs.com/ssltest/
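You can also inspect certificate dates locally with openssl. The demo below generates a throwaway self-signed cert so there is something to inspect; the same x509 command works on /etc/letsencrypt/live/edegan.com/fullchain.pem:

```shell
# Create a short-lived self-signed certificate in /tmp...
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" -days 30 \
    -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem 2>/dev/null
# ...and print its validity window (notBefore/notAfter dates).
openssl x509 -noout -dates -in /tmp/demo-cert.pem
```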
Finally, edit /etc/cron.d/certbot and append the following to the last line (after renew):

--renew-hook "systemctl reload apache2"

certbot renew --dry-run   # Tests the renewal!
PDFEmbed Issue
Enabling and requiring HTTPS causes an issue with PDFEmbed on mediawiki, where you get a blank frame. The PDF is still there, and other images load fine, but the PDF frame won't render the PDF. The problem is that the PDF is served over HTTP while the rest of the page is served over HTTPS, and Chrome (and perhaps other browsers) won't render the insecure content as a consequence (see [13] for a description of the symptoms, but not the solution).
The solution is to edit mediawiki/extensions/PDFEmbed/PDFEmbed.hooks.php. For me it was line 103 that previously said:
'src' => $file->getFullUrl().'#page='.$page,
I changed this line to:
'src' => preg_replace("/^http:/i", "https:", $file->getFullUrl()).'#page='.$page,
This is mentioned in a comment on a topic page, though presumably for an earlier version: https://www.mediawiki.org/wiki/Topic:Syxow0why4c0cvvm
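The patched line is just swapping the URL scheme. For a quick sanity check outside PHP, the same rewrite can be sketched with sed (note the PHP version is also case-insensitive, which this sketch is not):

```shell
# Force the scheme to https, essentially what the preg_replace in the patch does.
echo "http://www.edegan.com/images/File.pdf#page=2" | sed -E 's|^http:|https:|'
# -> https://www.edegan.com/images/File.pdf#page=2
```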
Another Issue
Interestingly, I started getting a message from Google Chrome whenever I went to post wiki entries saying: "The information you’re about to submit is not secure". There's an option to "Proceed anyway" or "Go back".
This started after I had MultiTail running viewing apache's logs, but beyond some kind of file-lock interaction, I couldn't see how that could cause it. I figured it was a coincidence and that something else must have happened.
My first thought was that my SSL certificate might have expired. However, the certificate looks valid and good, and the issue survived a reboot.
By inspecting the webpages (in Chrome) and then reviewing the Console, I could see that it was caused by a mixed content problem:
Mixed Content: The page at '<URL>' was loaded over HTTPS, but requested an insecure font '<URL>'. This request has been blocked; the content must be served over HTTPS.
It seemed that I somehow have some font addresses hardcoded somewhere:
Mixed Content: The page at ... was loaded over HTTPS, but requested an insecure font 'http://128.42.44.180/mediawiki/resources/assets/fonts/BonvenoCF-Light.otf'. This request has been blocked; the content must be served over HTTPS.
The copy of Chrome on my desktop must somehow have been upgraded? Or something else changed to cause a change in behavior...
The IP is from the old web server at the McNair Center, suggesting that when I migrated the McNair database into the new wiki, I migrated this issue. (Note that it doesn't appear to be something hardcoded into a .css file, or similar -- I can't find any trace on the filesystem and besides, this wiki was built from a fresh install.)
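A recursive grep is a quick way to rule the filesystem in or out. Sketched below on a scratch directory; on the server, point the same command at /var/www/html:

```shell
# Make a scratch dir containing a stylesheet with a hardcoded http:// font URL,
# then list every file under it that still references plain http://.
demo=$(mktemp -d)
printf '@font-face { src: url("http://128.42.44.180/fonts/Bonveno.otf"); }\n' > "$demo/common.css"
grep -rl "http://" "$demo"
# -> prints the path to common.css
```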
I found the URLs hardcoded in MediaWiki:Common.css (it must have been moved with the last big batch of pages and I somehow didn't notice!) but then couldn't edit it! It seems that from Mediawiki 1.32 on, the rights to edit the interface were separated out, and users now need the editinterface right to change anything in the Mediawiki namespace. So, I went to Special:UserRights and gave myself permission. Then I edited the page, which changed the look-and-feel of my editor (I have no idea why) and removed the console messages, but left the problem (even after a ctrl-shift-r cache flush in Chrome).
Install VSFTPD
With the security restrictions on wordpress, I now need an FTP server to get files for themes, plugins, etc. I like VSFTPD, as it's simple, secure, and has a nice standalone config. Old documentation from an earlier install is on the old Wordpress Blog Site (Tool) page. Instructions are here: https://linuxconfig.org/how-to-setup-ftp-server-on-ubuntu-20-04-focal-fossa-linux
apt-get install vsftpd
cp /etc/vsftpd.conf /etc/vsftpd.conf_orig
vi /etc/vsftpd.conf

#Change the following
write_enable=YES
local_umask=022
ssl_enable=YES

#Add the following (forces SSL/TLS; SSLv2 and SSLv3 are insecure and stay disabled)
allow_anon_ssl=NO
force_local_data_ssl=YES
force_local_logins_ssl=YES
ssl_tlsv1=YES
ssl_sslv2=NO
ssl_sslv3=NO

/etc/init.d/vsftpd restart
Then add a user and set it up:
useradd -m blog
passwd blog
usermod -a -G www-data blog
usermod -d /var/www/html/blog blog
Test it:
ftp 127.0.0.1
sftp 127.0.0.1
See also:
- http://praveen.kumar.in/2009/05/31/setting-up-ftps-using-vsftpd-for-wordpress-plugins-auto-upgrade/
- https://askubuntu.com/questions/14371/how-to-setup-ftp-to-use-in-locally-hosted-wordpress
To address some of the issues with the FTP server's file permissions in wordpress add to wp-config.php:
define( 'WP_CONTENT_DIR', 'wp-content' );
define( 'FTP_BASE', '/var/www/html/blog/' );
If I chown blog:blog /var/www/html/blog then everything seems to work fine when I sftp, but wordpress is unable to create a directory... I can't work out why this is happening. I expect it has to do with the need for another wordpress-specific define() statement, but I'm spending too much time on it. So I'm going to use direct installation of plugins instead, and remove the FTP server, as it is a point of vulnerability.
apt-get remove vsftpd
userdel blog
Final Configuration Changes to Apache
Lock down apache somewhat further (as now there are directories that shouldn't be listable, etc.)
cd /etc/apache2
vi apache2.conf

#Change the directory definitions. Note that if -SomeOption is used,
#then other options must have + or - in front of them:
<Directory /var/www/html>
    Options -Indexes +FollowSymLinks
    AllowOverride All
    Require all granted
</Directory>

systemctl reload apache2

#To debug:
systemctl status apache2.service
Remove the debug setup
In the wiki (LocalSettings.php), comment out the debug lines (I can't see from the documentation when I added them, but if you want to see error messages during configuration, you'd want them uncommented):
#error_reporting( -1 );
#ini_set( 'display_errors', 1 );
#$wgShowExceptionDetails = true;
#$wgShowDBErrorBacktrace = true;
#$wgShowSQLErrors = true;
Check the permissions set using $wgGroupPermissions - see https://www.mediawiki.org/wiki/Manual:User_rights
Run all the updates to the blog, etc., from the console before locking it down. Then, in wp-config.php, lock down the ability to install plugins, etc., by commenting:
#define('FS_METHOD','direct');
Edit the .htaccess files in blog and mediawiki to allow access but with appropriate restrictions.
Note that the rewrite rules for the blog are in its .htaccess file
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
To make the blog the default, edit /etc/apache2/sites-available/edegan.com.conf and add an alias (don't alias to index.php as it will cause design issues; the rewrite rule for that is already in the .htaccess file!):
Alias /blog /var/www/html/blog/
And change:
RewriteRule ^/*$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]
To:
RewriteRule ^/*$ %{DOCUMENT_ROOT}/blog/index.php [L]
Then:
systemctl reload apache2
Note: Don't change the DocumentRoot to the blog, as this will destroy the design of the wiki. The last rewrite rule will decide the default site!
Useful tools
Multitail
I installed Multitail:
apt-get install multitail
The manual is pretty weak, but the examples are good and the feature list is excellent. Here are some useful commands to review log files:
multitail -cS apache -ev "Bot" /var/log/apache2/access.log -ci white -e "Bot" -I /var/log/apache2/access.log
multitail -cS apache -ev "Bot" -ev "bot" -ev "internal dummy connection" /var/log/apache2/access.log
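If multitail isn't available, plain grep gets you most of the way. The sketch below filters a scratch log; for live use, pipe tail -f /var/log/apache2/access.log into the same grep:

```shell
# Build a small fake access log, then filter out bot noise with grep.
# -v inverts the match, -i ignores case, -E enables alternation.
printf 'GET / 200\nGooglebot crawl\ninternal dummy connection\nGET /wiki 200\n' > /tmp/access.demo
grep -viE "bot|internal dummy connection" /tmp/access.demo
# -> prints only the two GET lines
```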
Traceroute
apt install traceroute
Note: Zmap seems popular nowadays, based on traffic logs.
Old machines
For the configuration of the servers built for the McNair Center, see the old Center IT page or the pages below:
- Database Server Documentation
- RDP Documentation
- Test Web Server Documentation
- Web Server Documentation
Some of this information is still useful!
In addition, at UC Berkeley, Ed designed and built three machines - two postgresql database servers and a wiki server. The documentation is here:
- Haas PhD Server Configuration
- Posgres Server Configuration -- documents the build of postgres2