[[Category: McNair Admin]]

Alex's notes from creating a test web server that will eventually host important stuff (aka a test run on a cheap Dell Inspiron).

== Installing Ubuntu (12/18/15 - 1/1/16) ==

I chose the 14.04.3 (aka "Trusty Tahr", sometimes abbreviated as "trusty" in online package documentation) Ubuntu Server ISO image for the installation process. I did so after unsuccessfully trying to install from the Minimal ISO image. I found this Ubuntu documentation page helpful during the install, and I even did an MD5 checksum verification for the first time to make sure I downloaded the ISO image correctly.

The menus for both installations were almost identical (go figure), but the Server ISO image offered a subset of the choices presented by the Minimal ISO image (for example, the Minimal installation asks about shadow passwords and Linux kernels), in a slightly different order. Here are some of the less obvious choices that I made when installing from the Server ISO image; the remaining choices were either locale-based (e.g. time zone, keyboard layout, etc.) or network configuration (I chose the wired connection as the primary interface):

* hostname: mcnairtestwebserver
* encrypted home directory? no
* how to partition disk? Guided - use entire disk and set up LVM
* how much of volume group should be used for guided partitioning? 75%
* automatic updates? no

As a note, I did have to turn off Secure Boot in the Dell's UEFI firmware menu (the one that you get to by mashing F2 when the computer has just turned on and shows you the Dell logo) because Ubuntu kept having kernel panics about something attempting to kill init. This probably won't be a problem on the production server, but for replicability's sake, now you know.

== Installing the LAMP stack (1/1/16) ==

With the Server ISO image, one of the options during installation is to select packages that you might want to install. Obviously, I checked the LAMP server box. The Minimal installation has a similar screen, except with more options (mostly desktop-related packages, like GUIs and fonts). And now you have all that you need. Don't forget to update the package manager and upgrade all your packages:

 $ sudo apt-get update
 $ sudo apt-get upgrade

== Network troubleshooting (1/2/16 - 1/3/16) ==

I'm not sure if this is my house's ethernet or my own fault, but I spent a lot of time digging into Ubuntu network configuration. I know it can work: I installed packages during the installation of Ubuntu! Anyways, with the help of the internet (this thread and this SO question were useful, as was this Ubuntu documentation) and man pages, here's a quick troubleshooting guide:

First, some diagnostic commands:

 $ ping google.com
 $ ping localhost
 $ hostname -i
 $ ifconfig

Check some relevant configuration files (note that my ethernet connection interface is named "p3p1" and is configured for DHCP instead of a static IP address):

 $ cat /etc/resolv.conf
 $ cat /etc/hosts
 $ cat /etc/network/interfaces
 $ cat /etc/dhcp/dhclient.conf
 $ cat /var/lib/dhcp/dhclient.p3p1.leases

Next, try editing /etc/network/interfaces with sudo vi. I added two lines (Google's public DNS addresses and the DNS domain name of my network) into the p3p1 interface block:

 dns-nameservers 8.8.8.8 8.8.4.4
 dns-search attlocal.net

To apply the changes, use sudo ifdown p3p1 and sudo ifup p3p1 to take down and bring back the network interface, then try the above diagnostics again.

Also, for some reason, I was able to connect to the internet upon rebooting, but after trying the ifdown/ifup commands above, I wasn't able to get to the internet anymore. But then I rebooted again, and now I'm able to connect to the internet even after ifdown/ifup.

Edit (1/3/16): Dr. Egan notes (possibly from reading this SO post) that the following command to restart the network service is equivalent to the ifdown/ifup commands:

 $ sudo service network-manager restart

== Setting up SSH remote connection (1/3/16) ==

I got the brilliant idea to set up a remote connection to the Ubuntu box so that I could continue working on the box despite not being physically able to access it. Dr. Egan suggested SSH, and the adventure began. First, I installed the OpenSSH server, which receives SSH connections from SSH clients (I installed PuTTY as my SSH client on my Windows laptop):

 $ sudo apt-get install openssh-server

Then, according to suggestions from this Ubuntu help page, I backed up the sshd_config file to a read-only copy:

 $ sudo cp /etc/ssh/sshd_config /etc/ssh/sshd_config.original
 $ sudo chmod a-w /etc/ssh/sshd_config.original

Now the real fun begins. I wanted to use SSH keys (specifically RSA keys) for authentication instead of password authentication (as suggested by this other Ubuntu help page), but I needed a way to copy the public RSA key on my laptop (the SSH client) onto the Ubuntu box (the SSH server). I basically decided to strip all forms of authentication off of the SSH connection by editing sshd_config and then restarting the SSH service to apply the changes:

 $ sudo vi /etc/ssh/sshd_config
 $ sudo service ssh restart

The sshd_config man page helped a lot (especially in noting which options were on or off by default), but I basically disabled password authentication, RSA authentication, and pubkey authentication. Then, with my laptop connected to the same network as the Ubuntu box, I opened the SSH connection and copied my key into the authorized keys list (I had to make a new authorized_keys file):

 $ vi ~/.ssh/authorized_keys
 $ [copy my public key to ~/.ssh/ajiang_rsa.pub]
 $ cat ~/.ssh/ajiang_rsa.pub >> ~/.ssh/authorized_keys
 $ rm ~/.ssh/ajiang_rsa.pub

Then I went back to sshd_config and enabled RSA and pubkey authentication, kept password authentication off, allowed TCP and X11 forwarding, set the port to 23 (according to Dr. Egan's suggestion), and explicitly specified the authorized keys file (though the default would have worked too), restarting the SSH service again to apply the changes.

 $ sudo vi /etc/ssh/sshd_config
 $ sudo service ssh restart

I checked that the sshd service was running and which ports it was listening to with:

 $ ps -A | grep sshd
 $ sudo ss -lnp | grep sshd

Now I had to configure my network's router firewall to allow port forwarding from outside the network (aka a pinhole). I fixed the IP address assigned to the box to a single IP address that I knew would work, and then I went to the port forwarding configuration page to allow TCP port forwarding on port 23 to the Ubuntu box on port 23. The router gave me a public IP address, and I used that in my PuTTY client (along with my private key and port 23) to try an SSH connection, and it worked!

== Installing Mediawiki (1/4/16) ==

Mostly going to be following steps from this page on installing Mediawiki.

Make a directory for the stable version of Mediawiki (1.26.2), which isn't available through apt-get, so we're downloading the official tarball!

 $ mkdir ~/Downloads
 $ cd ~/Downloads
 $ wget https://releases.wikimedia.org/mediawiki/1.26/mediawiki-1.26.2.tar.gz
 $ tar -xvzf /pathtofile/mediawiki-*.tar.gz

Copy the extracted files to /var/lib/mediawiki:

 $ sudo mkdir /var/lib/mediawiki
 $ sudo mv mediawiki-1.26.2/* /var/lib/mediawiki

Then set up the mediawiki directory:

 $ cd /var/www/html
 $ sudo ln -s /var/lib/mediawiki mediawiki

== Mediawiki Security (1/15/16) ==

Mediawiki advises against implementing security measures because, if you're trying to make a publicly-editable wiki, you shouldn't need any user access restrictions at all (but you'll need to combat spam). We do want to make some pages publicly-editable (aka community-maintained) in the future, but much of the content should only be edited by us or a specified group of registered users. In addition, some pages should not even be viewable by unregistered or even registered users, whether that's in search results or through internal or external links.

The old webserver uses an extension called SimpleSecurity, but it's no longer maintained and has some known security issues (including allowing users to see the titles of pages for which they do not have read access). These issues may be fixed by another extension, RemoveProtectedContent, but it doesn't seem like the best option.

== Installing IntraACL (1/20/16) ==

I looked over some of the information on Mediawiki authorization extensions, including common security issues that many of the extensions have trouble fixing and a table listing several of the more popular authorization extensions and what features each supports. A new extension, IntraACL, seems to offer the most features and is the most recently maintained.

I followed the installation instructions for IntraACL pretty much line-for-line, including the patch.

== Understanding IntraACL (1/22/16) ==

So I must have set something during the installation or configuration of Mediawiki that revoked the 'edit' page permission from all users (even anonymous users), which I think applies before IntraACL. The line in LocalSettings.php looks like:

 $wgGroupPermissions['*']['edit'] = false;

and I commented it out so that IntraACL can manage permissions. I also found a special page in Mediawiki that lists the user group rights. Mediawiki has [https://www.mediawiki.org/wiki/Manual:User_rights a page] listing the different user rights and user groups.

The thing is, IntraACL has its own system of groups. So we can either focus entirely on IntraACL groups or try to apply broad changes with Mediawiki groups and LocalSettings.php $wgGroupPermissions. I think we should try going with IntraACL's groups...

== Finding Holes in IntraACL (1/22/16) ==

Going through the issues on [https://www.mediawiki.org/wiki/Security_issues_with_authorization_extensions this Mediawiki page]:

* Inclusion/Transclusion
* Preloading
* XML Export (Special:Export)
* Atom/RSS Feeds
* Listings & Search
*: Pages that can't be read still have their titles show up in search auto-complete, but not in search results. Can disable search-box autocomplete, as shown on [https://www.mediawiki.org/wiki/Manual:Enabling_autocomplete_in_search_box this Mediawiki page].
* Diff & Revision Links
* API
* Action Links
* Related Rights
* Author Backdoor
* Caching
* Files & Images
* Redirects
* Edit Section
* Watching Pages
* Other Extensions

== Creating Users (1/25/16) ==

Assuming that you have root, you can create user accounts and give them root too. The process is:

First create the user's group, checking the last group number used:

 $ cat /etc/group
 $ /usr/sbin/groupadd -g 515 username

Then add the user:

 $ /usr/sbin/useradd -g username -G root -s /bin/bash -p xxxx -d /home/username -m username

where -g is the primary group, -G is other groups, -p sets a password, -d declares a home directory, and -m makes the directory.

Change the user's password:

 $ passwd username

And add the user to the sudoers file:

 $ echo 'username ALL=(ALL) ALL' >> /etc/sudoers

To delete a user:

 $ /usr/sbin/userdel -r roger

where -r removes the home directory.

And to remove their group:

 $ /usr/sbin/groupdel username

== Labeled Section Transclusion (1/25/16) ==

Ed wanted me to look into Labeled Section Transclusion, which would allow two different sections of a single page to be transcluded into different locations. There's [https://www.mediawiki.org/wiki/Extension%3aLabeled_Section_Transclusion#How_it_works an extension] for that, but the extension download is only compatible with Mediawiki version 1.19 or newer, and our current webserver is on Mediawiki 1.13.3 (you can check the Mediawiki version on the [[Special:Version]] page). But I got the LST extension onto the test webserver and successfully transcluded different sections. Installing the extension is pretty straightforward: wget the tarball link, untar it, and copy it to the /var/lib/mediawiki/extensions directory:

 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/LabeledSectionTransclusion-REL1_26-60037a2.tar.gz
 $ tar -xzvf LabeledSectionTransclusion-REL1_26-60037a2.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/LabeledSectionTransclusion ./LabeledSectionTransclusion

Then add this line to LocalSettings.php:

 require_once("$IP/extensions/LabeledSectionTransclusion/LabeledSectionTransclusion.php");

[https://en.wikipedia.org/wiki/Wikipedia:Transclusion#Without_using_the_labeled_section_method This Wikipedia page] documents a method for selective transclusion that doesn't require the extension, but I was unable to replicate the results on the test webserver, so I assume that it requires some package or code that is specific to Wikipedia.

== Responsive Design (1/25/16) ==

According to [https://www.mediawiki.org/wiki/Manual:Mobiles,_tablets_and_responsive_design this mediawiki page], the extension that makes Wikipedia look pretty on mobile/tablet is called [https://www.mediawiki.org/wiki/Extension:MobileFrontend MobileFrontend] (and it requires another extension called [https://www.mediawiki.org/wiki/Extension:Mantle Mantle] for Mediawiki versions 1.24 or older). Since the extension only serves up downloads for MW 1.19 or newer, I have to test this on the test webserver (MW 1.26).

As with the LST extension, installation is pretty standard stuff: wget the tarball, untar it, and copy it to the extensions directory:

 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/MobileFrontend-REL1_26-187dae8.tar.gz
 $ tar -xzvf MobileFrontend-REL1_26-187dae8.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/MobileFrontend ./MobileFrontend

Then, per the installation instructions on the [https://www.mediawiki.org/wiki/Extension:MobileFrontend#Installation extension page], add these two lines to LocalSettings.php:

 require_once("$IP/extensions/MobileFrontend/MobileFrontend.php");
 $wgMFAutodetectMobileView = true;

== Short URLs (1/27/16) ==

The goal is to get the URLs to look like www.mcnaircenter.org/wiki/Page_Title. Mediawiki has [https://www.mediawiki.org/wiki/Manual:Short_URL/Apache a page] for configuring short URLs on Apache.

First, enable the mod_rewrite Apache module (if it isn't enabled already). In Apache 2.4.7, this can be done from the command line:

 $ sudo a2enmod rewrite

You can also check that mod_rewrite has been enabled by creating a phpinfo page and looking for mod_rewrite under the apache "Loaded Modules" section, as described in [http://stackoverflow.com/a/10891317 this Stack Overflow answer].

As described on the Mediawiki manual page (and also by [http://shorturls.redwerks.org/ the web utility] that the manual page recommends), add these lines to the end of the VirtualHost block in the Apache configuration file that defines the wiki's DocumentRoot. I added them to /etc/apache2/sites-available/000-default.conf; you may also want to make a backup of the original configuration file in case you want to revert.

 ## http://www.mediawiki.org/wiki/Manual:Short_URL/Apache
 # Enable the rewrite engine
 RewriteEngine On

 # Short url for wiki pages
 RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]

 # Redirect / to Main Page
 RewriteRule ^/*$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]

Then restart Apache:

 $ sudo service apache2 restart

Find the line in LocalSettings.php that sets $wgScriptPath and add these two lines below it:

 $wgScriptExtension = ".php";
 $wgArticlePath = "/wiki/$1";

== Infoboxes (2/1/16) ==

Here's the [https://www.mediawiki.org/wiki/Manual:Importing_Wikipedia_infoboxes_tutorial Mediawiki page] about importing infoboxes from Wikipedia. From Wikipedia itself, here's a [https://en.wikipedia.org/wiki/Category:Infobox_templates list] of infobox templates, a [https://en.wikipedia.org/wiki/Help:Infobox help page] on infoboxes, and [https://en.wikipedia.org/wiki/Help:Designing_infoboxes another help page] on designing infoboxes.

The installation procedure begins with installing the [https://www.mediawiki.org/wiki/Extension:Scribunto Scribunto extension]:

 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/Scribunto-REL1_26-4d4766f.tar.gz
 $ tar -xzvf Scribunto-REL1_26-4d4766f.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/Scribunto ./Scribunto

Then, per the installation instructions on the [https://www.mediawiki.org/wiki/Extension:Scribunto#Installation extension page], add these two lines to LocalSettings.php:

 require_once "$IP/extensions/Scribunto/Scribunto.php";
 $wgScribuntoDefaultEngine = 'luastandalone';

The [https://www.mediawiki.org/wiki/Extension:ParserFunctions ParserFunctions extension] comes with Mediawiki 1.18 or newer, but you still need to load the extension in LocalSettings.php:

 wfLoadExtension( 'ParserFunctions' );

Finally, give all users execute permissions on the Lua binaries that come with the extension (note that you should choose the correct OS in the binaries directory; you can use the uname -m command to determine whether the Linux kernel is 64- or 32-bit):

 $ sudo chmod a+x /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_64_generic/lua

And check whether SELinux is enabled with sestatus -v. If it is enabled, set the type:

 $ chcon -t httpd_sys_script_exec_t /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_64_generic/lua

Go to Wikipedia's [https://en.wikipedia.org/wiki/Special:Export Special:Export page] and list the infobox templates that you want to export. For example, I chose to export the following templates, and checked the options "Include only the current revision, not the full history", "Include templates", and "Save as file":

 Template:Infobox
 Template:Infobox/doc
 Template:Infobox person
 Template:Infobox person/doc
 Template:Infobox company
 Template:Infobox company/doc

Then go to the test wiki's Special:Import page and choose the XML file that you got from the step above. I chose to "Import to default locations".

The import threw a 500 Server Error. When I checked the Special:AllPages list of all pages in the Template namespace, the entries for the Template:Infobox etc. pages show up, but when I try to view them, I get another 500 Server Error.

== Debugging Special:Import Errors (2/1/16 - 2/5/16) ==

I had already created a phpinfo page for debugging, as described in [http://stackoverflow.com/a/10891317 this Stack Overflow answer]. (Note that $ php -i | grep 'what_you_seek' also works, but the command line interface, or CLI, may use a different configuration file.)

Note that the phpinfo page shows the location of the Apache error logs (look for APACHE_LOG_DIR) and the location of the php.ini configuration file (look for Configuration File).

Checking the Apache error logs reveals that the Import failed with this error:

 PHP Fatal Error: Call to undefined function pcntl_wifsignaled() in /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/LuaStandaloneEngine.php on line 645, referer: http://128.42.44.22/wiki/Special:Import

Looking for pcntl_wifsignaled in the phpinfo page reveals that it is disabled. I check that pcntl is actually installed:

 $ php -m | grep pcntl

Then I try to comment out the line in the php.ini configuration file that disables the pcntl functions:

 ;disable_functions = pcntl_alarm,pcntl_fork,pcntl_waitpid,pcntl_wait,pcntl_wifexited,pcntl_wifstopped,pcntl_wifsignaled,pcntl_wexitstatus,pcntl_wtermsig,pcntl_wstopsig,pcntl_signal,pcntl_signal_dispatch,pcntl_get_last_error,pcntl_strerror,pcntl_sigprocmask,pcntl_sigwaitinfo,pcntl_sigtimedwait,pcntl_exec,pcntl_getpriority,pcntl_setpriority,

But that also doesn't work: Import gives a 500 error again, and I still can't view the pages from the Special:AllPages list of the Template namespace due to another 500 error.

'''Day 2:'''

I try adding the following line to the php.ini configuration file under the "Dynamic Extensions" section to load the pcntl extension:

 extension=pcntl.so

The import still doesn't work (another 500 error). This [http://stackoverflow.com/a/8432855 SO answer] gives another way to check that Process Control (aka pcntl) is enabled:

 $ php --ri pcntl

And it is enabled. Okay, so backtracking a little. Here's a [http://trog.qgl.org/20140923/setting-up-infobox-templates-in-mediawiki-v1-23/ post] that mentions importing [https://en.wikipedia.org/wiki/MediaWiki:Common.css this stylesheet] from Wikipedia (the source wiki, aka where we're getting our infobox templates from). I exported that page with the three options -- "Include only the current revision, not the full history", "Include templates", and "Save as file" -- all checked. Importing the resulting XML file was successful. But trying to import the infobox templates immediately afterwards still failed (500 error).

And here's a random tangent: an [http://trog.qgl.org/20110815/setting-up-infobox-templates-in-mediawiki/ older version] of the post above mentions that the XML file needs to be modified: all of the instances of "text/plain" need to be replaced with "CONTENT_TEXT_FORMAT". There's a thread on Mediawiki that suggests the same fix (the suggestion is also about 3 years old, so it may just be referring to the old version of the post without citing it). The thing that makes me uncertain is that Scribunto's ticket trackers have some tickets about this suggested fix: [https://phabricator.wikimedia.org/T58249 this one] and [https://phabricator.wikimedia.org/T53504 this one] came up when I searched for "content_text_format scribunto". Both seem to conclude that the fix isn't necessary in newer versions of Scribunto and Mediawiki, and there's even a backport left there for people who are still trying the fix. So it seems that the fix won't actually change anything.

Well, since I imported the Common.css stylesheet, and the [https://www.mediawiki.org/wiki/Manual:Importing_Wikipedia_infoboxes_tutorial mediawiki infobox tutorial] suggests importing a javascript file as well, might as well try doing that. I exported [https://en.wikipedia.org/wiki/MediaWiki:Common.js the javascript] from Wikipedia with the three options -- "Include only the current revision, not the full history", "Include templates", and "Save as file" -- all checked. Importing that XML is also successful.

Importing the infobox templates still fails (500 error). Back to the Apache error logs... '''*sigh*'''

I'm dealing with the same 'undefined function pcntl_wifsignaled()' error that I faced last time, so I didn't make any progress.

Well, there are a few concerning things:
# I'm running all of my diagnostic commands (like $ php -m | grep pcntl or $ php --ri pcntl) on the command line, which uses a different php.ini configuration file that may have no correlation to the apache2 php.ini configuration file (though they appear to be almost identical).
# The undefined function pcntl_wifsignaled() is called by a function in LuaStandAlone.php called handleIOError(), which means that the pcntl_wifsignaled() error may be just a side effect of another error.

But wait, there's an error in the apache error log about failing to load pcntl.so:

 PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20121212/pcntl.so' - /usr/lib/php5/20121212/pcntl.so: cannot open shared object file: No such file or directory in Unknown on line 0

So adding the "extension=pcntl.so" line didn't really help (as I figured). I need to get the pcntl.so file. The [http://php.net/manual/en/pcntl.installation.php PHP docs for Process Control] say you need a special option when compiling PHP to enable pcntl, but the extension can also be built on its own from the PHP source:

 $ cd ~/Downloads
 $ wget http://php.net/get/php-5.5.9.tar.gz/from/this/mirror
 $ tar -zxvf php-5.5.9.tar.gz
 $ cd php-5.5.9/ext/pcntl
 $ sudo apt-get install php5-dev
 $ phpize
 $ ./configure
 $ make
 $ sudo cp pcntl.so /usr/lib/php5/20121212

== A Miraculous Fix (2/8/16) ==

So I loaded the Template:Infobox/doc page on the test wiki this afternoon, and it miraculously loaded! It still has a Lua script error, but at least it's not just 500 server errors all the way. The Lua script error seems to be a timeout error, and from some Google searching, it seems that the default timeout length is 10 seconds for Lua, whereas markup-based templates have a 60-second timeout limit.

Importing the XML file with the infobox templates now works as well. Really not sure how this worked...

== Configuring Image Uploads (2/8/16) ==

Might as well try to make some infoboxes. But first I need to configure file uploads. There's [https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads a Mediawiki page] that I followed pretty closely.

First, I went to the php.ini configuration file, checked that file_uploads was set to On, and noted that open_basedir isn't set.

Then I set the permissions for the images directory to 755 with:

 $ sudo chmod -R 755 /var/lib/mediawiki/images

And I also added these lines to the apache2.conf configuration file:

 <Directory /var/www/wiki/images>
 Options -Indexes
 </Directory>

Then I set the $wgEnableUploads option to true in LocalSettings.php.

When I try to upload a file, however, I get an exception. I added this line to LocalSettings.php to print a backtrace:

 $wgShowExceptionDetails = true;

And the resulting backtrace/error message:

 [b9bfa0ee] /wiki/Special:Upload MWException from line 1873 of /var/lib/mediawiki/includes/filerepo/file/LocalFile.php: Could not acquire lock for 'File-Donald_August_19_(cropped).jpg.'

 Backtrace:

 #0 /var/lib/mediawiki/includes/filerepo/file/LocalFile.php(1152): LocalFile->lock()
 #1 /var/lib/mediawiki/includes/upload/UploadBase.php(708): LocalFile->upload(string, string, string, integer, array, boolean, User)
 #2 /var/lib/mediawiki/includes/specials/SpecialUpload.php(488): UploadBase->performUpload(string, string, boolean, User)
 #3 /var/lib/mediawiki/includes/specials/SpecialUpload.php(197): SpecialUpload->processUpload()
 #4 /var/lib/mediawiki/includes/specialpage/SpecialPage.php(384): SpecialUpload->execute(NULL)
 #5 /var/lib/mediawiki/includes/specialpage/SpecialPageFactory.php(553): SpecialPage->run(NULL)
 #6 /var/lib/mediawiki/includes/MediaWiki.php(281): SpecialPageFactory::executePath(Title, RequestContext)
 #7 /var/lib/mediawiki/includes/MediaWiki.php(714): MediaWiki->performRequest()
 #8 /var/lib/mediawiki/includes/MediaWiki.php(508): MediaWiki->main()
 #9 /var/lib/mediawiki/index.php(41): MediaWiki->run()
 #10 {main}

Google searches yield [https://www.mediawiki.org/wiki/Thread:Project:Support_desk/File_Upload_Error this thread] and [https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Problem_With_File_Upload:_Could_not_acquire_lock_for_%22mwstore://local-backend/local-public/1/1e%22. this thread] dealing with this error message.

== Installing Ghost (3/14/2016) ==

You need to make sure you have the correct version of node installed (it should be a version of node that Ghost supports; at the time of writing, that's 0.10.x). Credit to [http://www.hostingadvice.com/how-to/install-nodejs-ubuntu-14-04/ this page] for helping me out.

 $ sudo apt-get install nodejs
 $ nodejs -v

Go ahead and try to remove the old version of node, then clean up any unused packages:

 $ sudo apt-get remove --purge node
 $ sudo apt-get autoremove

Make a symbolic link from nodejs to node:

 $ sudo ln -s /usr/bin/nodejs /usr/bin/node

Install npm too and check the version:

 $ sudo apt-get install npm
 $ npm -v

Now you can follow the [http://support.ghost.org/installing-ghost-linux/ instructions] for installing Ghost!

 $ curl -L https://ghost.org/zip/ghost-latest.zip -o ghost.zip
 $ sudo apt-get install unzip
 $ sudo mkdir /var/www/ghost
 $ unzip -uo ghost.zip -d /var/www/ghost
 $ cd /var/www/ghost
 $ sudo npm install
 $ sudo npm start

Note that we had to install the unzip package. I also chose not to install in the [http://support.ghost.org/config/#about-environments production environment] so that I would have more debugging info and the ability to tinker with the theming. Ghost should now be serving on port 2368.

== Ghost on Apache? (3/14/2016) ==

So Ed would rather not serve the blog off of port 2368. Looks like there's documentation for setting Ghost up on [http://support.ghost.org/basic-nginx-config/ nginx] and [https://www.howtoinstallghost.com/how-to-host-ghost-on-an-apache-subdomain/ apache]. Since we would rather not move the mediawiki off of apache, let's just try doing Ghost and apache.

Turns out that doing so is a lot more hassle than I initially thought. Ghost and nginx may be easier, and maybe Ghost+apache isn't even that bad, but it's definitely more involved, especially when setting it up alongside another site (the mediawiki).

[https://www.howtoinstallghost.com/how-to-host-ghost-on-an-apache-subdomain/ This page] looked helpful, and [https://www.howtoforge.com/tutorial/how-to-install-ghost-blog-on-ubuntu-15.10/#step-install-ghost-blog this post] has a complete tutorial if you want to go through with it, but it seems difficult to get Ghost set up correctly on Apache.

== Installing WordPress (3/14/2016) ==

Following the [http://codex.wordpress.org/Installing_WordPress#Detailed_Instructions Detailed Instructions] to install WordPress was enough to get me started:

 $ cd ~/Downloads
 $ wget https://wordpress.org/latest.tar.gz
 $ tar -xzvf latest.tar.gz

Configure a database for WordPress (it can be called something other than wordpress) and make a new MySQL user (it can be called something other than mcnair_wp) that has all permissions for the wordpress database. Obviously, you should replace a_secure_password with an actual password for the user (but leave the quotes around the password when typing the MySQL command). FLUSH PRIVILEGES reloads the permissions tables.

 $ mysql -u root -p
 Enter password:

 mysql> CREATE DATABASE wordpress;
 mysql> GRANT ALL PRIVILEGES ON wordpress.* TO "mcnair_wp"@"localhost" IDENTIFIED BY "a_secure_password";
 mysql> FLUSH PRIVILEGES;
 mysql> EXIT

You can verify that the wordpress database and user were created correctly by logging into the mysql client command-line interface as the new user:

 $ mysql -u mcnair_wp -p
 Enter password:

 mysql> SHOW DATABASES;
 mysql> USE wordpress;
 mysql> EXIT

Make a wp-config.php by making a copy of the wp-config-sample.php file and renaming it:

 $ cp ~/Downloads/wordpress/wp-config-sample.php ~/Downloads/wordpress/wp-config.php
 $ sudo vi ~/Downloads/wordpress/wp-config.php

Edit the lines that define the DB_NAME, DB_USER, and DB_PASSWORD constants to have the values that you used to set up the MySQL database and user above.

Copy the wordpress directory to /var/lib/wordpress and then make a symlink from /var/www/html/blog to /var/lib/wordpress (much like how the mediawiki was done) so that http://128.42.44.22/blog points to the WP blog:

 $ sudo cp -r ~/Downloads/wordpress /var/lib/wordpress
 $ cd /var/www/html
 $ sudo ln -s /var/lib/wordpress blog

Navigate a browser to http://128.42.44.22/blog/wp-admin/install.php to complete the installation (you'll be asked to create an admin user for the WordPress site).

== Installing Open Web Analytics (3/21/2016) ==

 $ cd ~/Downloads
 $ git clone https://github.com/padams/Open-Web-Analytics.git
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/Open-Web-Analytics ./owa

Edit LocalSettings.php and add the following line:

 require_once('extensions/owa/mw_plugin.php');

Go to the list of Special Pages on the mediawiki and click on the Open Web Analytics special page to install OWA.

Long story short, this extension has only been [https://github.com/padams/Open-Web-Analytics/wiki/MediaWiki-Integration tested] up to Mediawiki version 1.16. I tried so hard, and got so far, but in the end, it doesn't even matter.

== Installing Piwik (3/21/2016) ==

Installing Piwik itself ([https://piwik.org/docs/installation/#start-the-installation instructions from Piwik]):

 $ cd ~/Downloads
 $ wget http://builds.piwik.org/piwik.zip && unzip piwik.zip
 $ sudo cp -r ~/Downloads/piwik /var/lib/piwik
 $ cd /var/lib/piwik
 $ sudo chmod 777 tmp
 $ cd /var/www/html
 $ sudo ln -s /var/lib/piwik analytics

Navigate a browser to http://128.42.44.22/analytics and go through the Piwik installation. Make sure you fix everything on the "System Checks" page.

When you get to the Database Setup page, you'll need to configure a MySQL database for Piwik. If you followed the steps for configuring a WordPress database, the steps are almost identical.

Configure a database for Piwik (it can be called something other than piwik) and make a new MySQL user (it can be called something other than mcnair_piwik) that has all permissions for the piwik database. Obviously, you should replace a_secure_password with an actual password for the user (but leave the quotes around the password when typing the MySQL command). FLUSH PRIVILEGES reloads the permissions tables.

 $ mysql -u root -p
 Enter password:

 mysql> CREATE DATABASE piwik;
 mysql> GRANT ALL PRIVILEGES ON piwik.* TO "mcnair_piwik"@"localhost" IDENTIFIED BY "a_secure_password";
 mysql> FLUSH PRIVILEGES;
 mysql> EXIT

You can verify that the piwik database and user were created correctly by logging into the mysql client command-line interface as the new user:

 $ mysql -u mcnair_piwik -p
 Enter password:

 mysql> SHOW DATABASES;
 mysql> USE piwik;
 mysql> EXIT

Installing the Piwik Integration extension for Mediawiki:

 $ cd ~/Downloads
 $ wget https://github.com/DaSchTour/piwik-mediawiki-extension/archive/master.zip
 $ unzip -uo master.zip -d /var/lib/mediawiki/extensions
 $ cd /var/lib/mediawiki/extensions
 $ mv piwik-mediawiki-extension-master/ Piwik/

Edit LocalSettings.php to add these lines:

 require_once("$IP/extensions/Piwik/Piwik.php");
 $wgPiwikURL = "128.42.44.22/analytics/";
 $wgPiwikIDSite = "1";

But it doesn't seem to register the visits...

Turns out Piwik honors DoNotTrack by default (as I learned [http://piwik.org/faq/troubleshooting/faq_58/ here]), so my browser wouldn't register as a visit. With that accounted for, there are visits now. Yay!

Also, to fix the broken little graphs next to the numbers, you have to get the most recent version of the GD module for PHP:

 $ sudo apt-get install php5-gd
 $ sudo service apache2 restart

== Installing Google Analytics (3/23/2016) ==

 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/googleAnalytics-REL1_26-d832801.tar.gz
 $ tar -xzvf googleAnalytics-REL1_26-d832801.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/googleAnalytics ./GoogleAnalytics

Add these lines to LocalSettings.php:

 require_once("$IP/extensions/GoogleAnalytics/googleAnalytics.php");
 // Replace xxxxxxx-x with YOUR GoogleAnalytics UA number
 $wgGoogleAnalyticsAccount = 'UA-xxxxxxx-x';

== Installing Semantic Mediawiki (3/25/2016) ==

The [https://www.semantic-mediawiki.org/wiki/Help:Installation/Using_Composer_with_MediaWiki_1.25%2B installation process] looks complicated, but let's be careful. First up, we're going to [https://getcomposer.org/doc/00-intro.md#globally install Composer globally]:

 $ cd ~/Downloads

and then follow the download instructions [https://getcomposer.org/download/ here] to install Composer to the ~/Downloads directory. Then move the composer.phar executable to a directory in our path:

 $ sudo mv ~/Downloads/composer.phar /usr/local/bin/composer
 $ composer --version
 $ composer --list

Now you can just call "composer" instead of doing "php /path/to/composer/composer.phar".

Proceeding with the install for SMW:

 $ cd /var/lib/mediawiki
 $ sudo composer require mediawiki/semantic-media-wiki "~2.3" --update-no-dev

Make sure to replace "~2.3" with the appropriate latest release version, ignoring the third number (e.g. if the latest release is 2.3.1, then use "~2.3").

 $ php maintenance/update.php
 $ sudo vi LocalSettings.php

And add this line to the bottom of LocalSettings.php:

 enableSemantics('domain_name.com');

Check that the mediawiki site recognizes that the extension has been installed by visiting the Special:Version page.

You can test that the SMW annotations are working by following the instructions on this page: https://www.semantic-mediawiki.org/wiki/Help:Testing

== Installing Semantic Forms (3/25/2016) ==

Don't install from Mediawiki's Extension Distributor (according to the [https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Download_and_installation documentation])! Instead, get it from the Git repository:

 $ cd ~/Downloads
 $ git clone https://git.wikimedia.org/git/mediawiki/extensions/SemanticForms.git
 $ cp -r ./SemanticForms /var/lib/mediawiki/extensions/SemanticForms
 $ cd /var/lib/mediawiki
 $ sudo vi LocalSettings.php

Add the following line to LocalSettings.php:

 include_once("$IP/extensions/SemanticForms/SemanticForms.php");

== Semantic Forms Examples (3/28/2016) ==

There's an [https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Quick_start_guide#Example example data structure] on the Semantic Forms page. I followed most of the steps, except for '''Enabling links to forms'''. I couldn't get #formredlink to work properly (the template wasn't parsing the silent property declaration with #set properly), so I instead added a line to the "Was written by" property page:

 <nowiki>[[Creates pages with form::Author]]</nowiki>

This way, once a new Book page is created, the redlink for the book's author (when clicked) automatically generates an Author page.

I also added #default_form lines at the end of the Book and Author template pages. For the Author template page, for example, the last few lines looked like:

 <nowiki>
 ...
 [[Category:Authors]]
 {{#default_form:Author}}
 </includeonly>
 </nowiki>

and I did something similar for the Book template page. This way, every Book and Author page will have an "Edit with form" tab in addition to the "Edit" tab (and the "Edit with form" tab is significantly more useful).

== Installing Cargo (3/28/2016) ==

 $ cd ~/Downloads
 $ git clone https://git.wikimedia.org/git/mediawiki/extensions/Cargo.git
 $ cp -r ./Cargo /var/lib/mediawiki/extensions/Cargo
 $ cd /var/lib/mediawiki
 $ sudo vi LocalSettings.php

Add the following line to the LocalSettings.php configuration file:

 require_once( "$IP/extensions/Cargo/Cargo.php" );

Then back to the console to run the database updater:

 $ php maintenance/update.php

== Cargo Examples (3/28/2016) ==

First, remove Semantic Mediawiki by going to the composer.json file in the Mediawiki root directory and deleting the line that requires the semantic-media-wiki package. Then run <code>sudo composer update</code> from the Mediawiki root directory.

Create templates using the special page under the Semantic Forms category. Then, to create the data tables, go to each template page and choose "Create data table" from the dropdown next to the edit tab. After creating the data tables once this way, it seems that you can recreate the data from the command line (if you do this before creating the data tables, nothing happens...):

 $ cd /var/lib/mediawiki/extensions/Cargo/maintenance
 $ php cargoRecreateData.php

You'll have to edit the template pages to add queries, but at that point, you may just want to write the templates yourself (see the sketch below).

As with Semantic Mediawiki + Semantic Forms, you can add the #default_form parser function to the template page to display an "Edit with form" tab alongside the "Edit" tab (you'll likely have to refresh the page to see the changes).

== Cargo Data structuring (3/30/2016) ==

Sahil and I came up with a SQL database schema for the organizations and events in the startup ecosystem. Organization subtypes are: startups, VC funds, accelerators, incubators, service providers. Event subtypes are: financing, training, liquidity. Each subtype has fields specific to it, but all organizations need to have a name, logo, URL, address, founding date, and status, and all events need to have a date and need to include which organizations are involved.

We tried doing foreign keys, but you can't do that with Cargo, so maybe we should look into other options. One easy way out would be to just duplicate the columns that are common to all organizations in each organization subtype's table, but this seems like bad practice. I found [https://www.mediawiki.org/wiki/Extension:Cargo/Storing_data#Attaching_to_a_table #cargo_attach], which may help us in this sort of situation (sketch below).

There is an issue with using <code>#cargo_query</code> in the test wiki. When a new page is created that is related to an existing page, the existing page should update automatically to display that it is related to the new page. Instead, the old page won't display its relation to the new page until someone opens the old page, hits the edit button, and saves an edit, even if there is no change in the page text from the previous version. For example, on the page [http://128.42.44.22/wiki/Bolt Bolt], you can see it is directed by Byron Howard, and on the page [http://128.42.44.22/wiki/Byron_Howard Byron Howard], you can see that Bolt is one of his films. This is working as intended; however, the page [http://128.42.44.22/wiki/Tangled Tangled] shows that Tangled is also directed by Byron Howard, yet Byron Howard's page did not show Tangled as one of his films until I edited and saved his page, at which point it updated.

It seems the above problem is caused by the fact that when <code>#cargo_query</code> is used on a page, the query is not run each time the page is refreshed, but only when an edit to the page is saved, with that result then served from then on. This could cause display issues in the future, with pages for accelerators not displaying newly added companies.

== Back to Semantic Mediawiki (4/8/2016) ==

Semantic Mediawiki might be better for the inheritance (i.e. foreign keys in SQL). In SMW, we can define the properties and templates for a superclass and a subclass. If we make a form that includes both templates, the form creates pages that have the templates for the superclass and subclass included, and the properties are all there. This might be the solution we're looking for, but it seems to be more difficult to query, since there is no explicit distinction between the attributes common to all subclasses and the attributes specific to a single subclass.

Ed likes it so far. We should just move forward with the actual data structure on SMW until major roadblocks prevent further progress.

Some notes on why SMW seems to be more flexible than Cargo: in SMW, the properties are responsible for storing data, whereas in Cargo, the templates are responsible for storing data (using #cargo_store calls). This means that a Semantic Form using SMW properties that includes multiple templates is okay, whereas a Semantic Form using Cargo tables needs to ensure that the templates each affect different tables, which isn't the case for inheritance.

== To-do list ==

* Google Analytics on the test blog. We need FTP access (port 21) to be able to install new plugins, apparently...

[[admin_classification::IT Build| ]]
Latest revision as of 18:39, 20 March 2017
Alex's notes from creating a test web server that will eventually host important stuff (aka a test run on a cheap Dell Inspiron).
Contents
- 1 Installing Ubuntu (12/18/15 - 1/1/16)
- 2 Installing the LAMP stack (1/1/16)
- 3 Network troubleshooting (1/2/16 - 1/3/16)
- 4 Setting up SSH remote connection (1/3/16)
- 5 Installing Mediawiki (1/4/16)
- 6 Mediawiki Security (1/15/16)
- 7 Installing IntraACL (1/20/16)
- 8 Understanding IntraACL (1/22/16)
- 9 Finding Holes in IntraACL (1/22/16)
- 10 Creating Users (1/25/16)
- 11 Labeled Section Transclusion (1/25/16)
- 12 Responsive Design (1/25/16)
- 13 Short URLs (1/27/16)
- 14 Infoboxes (2/1/16)
- 15 Debugging Special:Import Errors (2/1/16 - 2/5/16)
- 16 A Miraculous Fix (2/8/16)
- 17 Configuring Image Uploads (2/8/16)
- 18 Installing Ghost (3/14/2016)
- 19 Ghost on Apache? (3/14/2016)
- 20 Installing WordPress (3/14/2016)
- 21 Installing Open Web Analytics (3/21/2016)
- 22 Installing Piwik (3/21/2016)
- 23 Installing Google Analytics (3/23/2016)
- 24 Installing Semantic Mediawiki (3/25/2016)
- 25 Installing Semantic Forms (3/25/2016)
- 26 Semantic Forms Examples (3/28/2016)
- 27 Installing Cargo (3/28/2016)
- 28 Cargo Examples (3/28/2016)
- 29 Cargo Data structuring (3/30/2016)
- 30 Back to Semantic Mediawiki (4/8/2016)
- 31 To-do list
Installing Ubuntu (12/18/15 - 1/1/16)
I chose the 14.04.3 (aka "Trusty Tahr", sometimes abbreviated as "trusty" in online package documentation) Ubuntu Server ISO image for the installation process. I did so after unsuccessfully trying to install from the Minimal ISO image. I found this Ubuntu documentation page helpful during the install, and I even did a MD5 checksum verification for the first time to make sure I downloaded the ISO image correctly.
The menus for both installations were almost identical (go figure), but the Server ISO image offered a subset of the choices presented by the Minimal ISO image (for example, the Minimal installation asks about shadow passwords and Linux kernels), in a slightly different order. Here are some of the less obvious choices that I made when installing from the Server ISO image; the remaining choices were either locale-based (e.g. time zone, keyboard layout, etc.) or network configuration (I chose the wired connection as the primary interface):
- hostname: mcnairtestwebserver
- encrypted home directory? no
- how to partition disk? Guided - use entire disk and set up LVM
- how much of volume group should be used for guided partitioning? 75%
- automatic updates? no
As a note, I did have to turn off Secure boot in the Dell's UEFI firmware menu (the one that you get to by mashing F2 when the computer just turned on and shows you the Dell logo) because Ubuntu kept having Kernel panics about something attempting to kill init. This probably won't be a problem on the production server, but for replicatability's sake, now you know.
Installing the LAMP stack (1/1/16)
With the Server ISO image, one of the options during installation is to select packages that you might want to install. Obviously, I checked the LAMP server box. The Minimal installation has a similar screen, except with more options (mostly desktop-related packages, like GUIs and fonts). And now you have all that you need. Don't forget to update the package manager and upgrade all your packages:
$ sudo apt-get update $ sudo apt-get upgrade
Network troubleshooting (1/2/16 - 1/3/16)
I'm not sure if this is my house's ethernet or my own fault, but I spent a lot of time digging into Ubuntu network configuration. I know it can work: I installed packages during the installation of Ubuntu! Anyways, with the help of the internet (this thread and this SO question were useful, as was this Ubuntu documentation) and man pages, here's a quick troubleshooting guide:
First, some diagnostics commands:
$ ping google.com $ ping localhost $ hostname -i $ ifconfig
Check some relevant configuration files (note that my ethernet connection interface is named "p3p1" and is configured for DHCP instead of a static IP address):
$ cat /etc/resolv.conf $ cat /etc/hosts $ cat /etc/network/interfaces $ cat /etc/dhcp/dhclient.conf $ cat /var/lib/dhcp/dhclient.p3p1.leases
Next, try editing /etc/network/interfaces with sudo vi. I added two lines (Google's public DNS addresses and the DNS domain name of my network) into the p3p1 interface block:
dns-nameservers 8.8.8.8 8.8.4.4 dns-search attlocal.net
To make the changes, use sudo ifdown p3p1 and sudo ifup p3p1 to take down and bring back the network interface and try the above diagnostics again.
Also, for some reason, I was able to connect to the internet upon rebooting, but after trying the ifdown/ifup commands above, I wasn't able to get to the internet anymore. But then I rebooted again, and now I'm able to connect to the internet even after ifdown/ifup.
Edit (1/3/16): Dr. Egan notes (possibly from reading this SO post) that the following command to restart the network service is equivalent to the ifdown/ifup commands:
$ sudo service network-manager restart
Setting up SSH remote connection (1/3/16)
I got the brilliant idea to set up a remote connection to the Ubuntu box so that I could continue working on the box despite not being physically able to access it. Dr. Egan suggested SSH, and the adventure began. First, I installed the OpenSSH server, which receives SSH connections from SSH clients (I installed PuTTY as my SSH client on my Windows laptop):
$ sudo apt-get install openssh-server
Then, according to suggestions from this Ubuntu help page, I backed up the sshd_config file to a read-only copy:
$ sudo cp /etc/ssh/sshd_config /etc/ssh/sshd_config.original $ sudo chmod a-w /etc/ssh/sshd_config.original
Now the real fun begins. I wanted to use SSH keys (specifically RSA keys) for authentication instead of password authentication (as suggested by this other Ubuntu help page), but I needed a way to copy the public RSA key on my laptop (the SSH client) onto the Ubuntu box (the SSH server). I basically decided to strip all forms of authentication off of the SSH connection by editing sshd_config and then restarting the SSH service to apply the changes:
$ sudo vi /etc/ssh/sshd_config $ sudo service ssh restart
The sshd_config man page helped a lot (especially in noting which options were on or off by default), but I basically disabled password authentication, RSA authentication, and pubkey authentication. Then, with my laptop connected to the same network as the Ubuntu box, I opened the SSH connection and copied my key into the authorized keys list (I had to make a new authorized_keys file):
$ vi ~/.ssh/authorized_keys $ [copy my public key to ~/.ssh/ajiang_rsa.pub] $ cat ~/.ssh/ajiang_rsa.pub >> ~/.ssh/authorized_keys $ rm ~/.ssh/ajiang_rsa.pub
Then I went back to sshd_config and enabled RSA and pubkey authentication, kept password authentication off, allowed TCP and X11 forwarding, set the port to 23 (according to Dr. Egan's suggestion), and explicitly specified the authorized keys file (though the default would have worked too), restarting the SSH service again to apply the changes.
$ sudo vi /etc/ssh/sshd_config $ sudo service ssh restart
I checked that the sshd service was running and which ports it was listening to with:
$ ps -A | grep sshd $ sudo ss -lnp | grep sshd
Now I had to configure my network's router firewall to allow port forwarding from outside the network (aka a pinhole). I fixed the IP address assigned to the box to a single IP address that I knew would work, and then I went to the port forwarding configuration page to allow TCP port forwarding on port 23 to the Ubuntu box on port 23. The router gave me a public IP address, and I used that in my PuTTY client (along with my private key and port 23) to try a SSH connection, and it worked!
Installing Mediawiki (1/4/16)
Mostly going to be following steps from this page on installing Mediawiki.
Make a directory for the stable version of Mediawiki (1.26.2), which isn't available through apt-get, so we're downloading the official tarball!
$ mkdir ~/Downloads $ cd ~/Downloads $ wget https://releases.wikimedia.org/mediawiki/1.26/mediawiki-1.26.2.tar.gz $ tar -xvzf /pathtofile/mediawiki-*.tar.gz
Copy the extracted files to /var/lib/mediawiki:
$ sudo mkdir /var/lib/mediawiki $ sudo mv mediawiki-1.26.2/* /var/lib/mediawiki
Then set up the mediawiki directory:
$ cd /var/www/html $ sudo ln -s /var/lib/mediawiki mediawiki
Mediawiki Security (1/15/16)
Mediawiki advises against implemented security measures because, if you're trying to make a publicly-editable wiki, you should need any user access restrictions at all (but you'll need to combat spam). We do want to make some pages publicly-editable (aka community-maintained) in the future, but much of the content should only be edited by us or a specified group of registered users. In addition, some pages should not even be viewable by unregistered or even registered users, whether that's on search results or through internal or external links.
The old webserver uses an extension called SimpleSecurity, but it's no longer maintained and has some known security issues (including allowing users to be able to see the titles of pages for which they do not have read access). These issues may be fixed by another extension, RemoveProtectedContent, but it doesn't seem like the best option.
Installing IntraACL (1/20/16)
I looked over some of the information on Mediawiki authorization extensions, including common security issues that many of the extensions have trouble fixing and a table listing several of the more popular authorization extensions and what features each supports, and a new extension, IntraACL seems to offer the most features and is the most recently maintained.
I followed the installation instructions for IntraACL pretty much line-for-line, including the patch.
Understanding IntraACL (1/22/16)
So I must have set something during the installation or configuration of Mediawiki that revoked the 'edit' page permission from all users (even anonymous users), which I think applies before the IntraACL. The line in LocalSettings.php looks like:
$wgGroupPermissions['*']['edit'] = false;
and I commented it out so that the IntraACL can manage permissions. I also found a special page in Mediawiki that lists the user group rights. Mediawiki has a page listing the different user rights and user groups.
The thing is, IntraACL has its own system of groups. So we can either focus entirely on IntraACL groups or try to apply broad changes with Mediawiki groups and LocalSettings.php $wgGroupPermissions. I think we should try going with IntraACL's groups...
Finding Holes in IntraACL (1/22/16)
Going through the issues on this Mediawiki page
- Inclusion/Transclusion
- Preloading
- XML Export (Special:Export)
- Atom/RSS Feeds
- Listings & Search
Pages that can't be read still have their titles show up in search auto-complete, but not in search results. Can disable search-box autocomplete, as shown on this Mediawiki page
- Diff & Revision Links
- API
- Action Links
- Related Rights
- Author Backdoor
- Caching
- Files & Images
- Redirects
- Edit Section
- Watching Pages
- Other Extensions
Creating Users (1/25/16)
Assuming that you have root, you can create user accounts and give them root too. The process is:
First create the users group, checking the last group number:
$ cat /etc/group $ /usr/sbin/groupadd -g 515 username
Then add the user
$ /usr/sbin/useradd -g username -G root -s /bin/bash -p xxxx -d /home/username -m username where g is the primary group, G is other groups, p sets a password, d declares a home directory and m makes the directory
Change the user's password:
$ passwd username
And add the user to the sudoers file
$ echo 'username ALL=(ALL) ALL' >> /etc/sudoers
To delete a user:
$ /usr/sbin/userdel -r roger where r removes the home directory
And to remove their group
$ /usr/sbin/groupdel username
Labeled Section Transclusion (1/25/16)
Ed wanted me to look into Labeled Section Transclusions, which would allow two different sections of a single page to be transcluded onto different locations. There's an extension for that, but the extension download is only compatible with Mediawiki version 1.19 or newer, and our current webserver is on Mediawiki 1.13.3 (can check Mediawiki version on the Special:Version page). But I got the LST extension on the test webserver and successfully transcluded different sections. Installing the extension is pretty straightforward: wget the tarball link, untar it, and copy to the /var/lib/mediawiki/extensions directory:
$ cd ~/Downloads $ wget https://extdist.wmflabs.org/dist/extensions/LabeledSectionTransclusion-REL1_26-60037a2.tar.gz $ tar -xzvf LabeledSectionTransclusion-REL1_26-60037a2.tar.gz $ cd /var/lib/mediawiki/extensions $ cp -r ~/Downloads/LabeledSectionTransclusion ./LabeledSectionTransclusion
Then add this line to LocalSettings.php:
require_once("$IP/extensions/LabeledSectionTransclusion/LabeledSectionTransclusion.php");
This Wikipedia page documents a method for selective transclusion that doesn't require the extension, but I was unable to replicate the results on the test webserver, so I assume that it requires some package or code that is specific to Wikipedia.
Responsive Design (1/25/16)
According to this mediawiki page, the extension that makes Wikipedia look pretty on mobile/tablet is called MobileFrontend (and it requires another extension called Mantle for Mediawiki versions 1.24 or older). Since the extension only serves up downloads for MW 1.19 or newer, I have to test this on the test webserver (MW 1.26).
As with the LST extension, the installation of the extension is pretty standard stuff: wget the tarball, untar it, and copy it to the extensions directory:
$ cd ~/Downloads $ wget https://extdist.wmflabs.org/dist/extensions/MobileFrontend-REL1_26-187dae8.tar.gz $ tar -xzvf MobileFrontend-REL1_26-187dae8.tar.gz $ cd /var/lib/mediawiki/extensions $ cp -r ~/Downloads/MobileFrontend ./MobileFrontend
Then, per the installation instructions on the extension page, add these two lines to LocalSettings.php:
require_once("$IP/extensions/MobileFrontend/MobileFrontend.php"); $wgMFAutodetectMobileView = true;
Short URLs (1/27/16)
The goal is to get the URLs to look like www.mcnaircenter.org/wiki/Page_Title. Mediawiki has a page for configuring short URLs on Apache.
First, enable the mod_rewrite Apache module (if it isn't enabled already). In Apache 2.4.7, this can be done from the command line:
$ sudo a2enmod rewrite
You can also check that mod_rewrite has been enabled by creating a phpinfo page and looking for mod_rewrite under the apache "Loaded Modules" section as described in this Stack Overflow answer.
As described on the Mediawiki manual page (and also by the web utility that the manual page recommends), add these lines to the end of the VirtualHost block in the Apache configuration file that defines the wiki's DocumentRoot (I added them to /etc/apache2/sites-available/000-default.conf): (You may also want to make a backup of the original configuration file in case you want to revert)
## http://www.mediawiki.org/wiki/Manual:Short_URL/Apache # Enable the rewrite engine RewriteEngine On # Short url for wiki pages RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/mediawiki/index.php [L] # Redirect / to Main Page RewriteRule ^/*$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]
Then restart Apache:
$ sudo service apache2 restart
Find the line in LocalSettings.php that sets the $wgScriptPath and add these two lines below that line:
$wgScriptExtension = ".php"; $wgArticlePath = "/wiki/$1";
Infoboxes (2/1/16)
Here's the Mediawiki page about importing infoboxes from Wikipedia (here's a list of infobox templates and a help page on infoboxes and another help page on designing infoboxes from Wikipedia).
Installation procedure begins with installing the Scribunto extension:
$ cd ~/Downloads $ wget https://extdist.wmflabs.org/dist/extensions/Scribunto-REL1_26-4d4766f.tar.gz $ tar -xzvf Scribunto-REL1_26-4d4766f.tar.gz $ cd /var/lib/mediawiki/extensions $ cp -r ~/Downloads/Scribunto ./Scribunto
Then, per the installation instructions on the extension page, add these two lines to LocalSettings.php:
require_once "$IP/extensions/Scribunto/Scribunto.php"; $wgScribuntoDefaultEngine = 'luastandalone';
The ParserFunctions extension comes with Mediawiki 1.18 or newer, but you still need to load the extension in LocalSettings.php:
wfLoadExtension( 'ParserFunctions' );
Finally, give all users execute permissions on the Lua binaries that come with the extension (note that you should choose the correct OS in the binaries directory; you can use the uname -m command to determine whether the Linux kernel is 64- or 32-bit)
$ sudo chmod a+x /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_64_generic/lua
And check if SELinux is enabled with sestatus -v. If it is enabled, set the type:
$ chcon -t httpd_sys_script_exec_t /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_64_generic/lua
Go to Wikipedia's Special:Export page and list the infobox templates that you want to export. For example, I chose to export the following templates, and checked the options "Include only the current revision, not the full history", "Include templates", and "Save as file":
Template:Infobox
Template:Infobox/doc
Template:Infobox person
Template:Infobox person/doc
Template:Infobox company
Template:Infobox company/doc
Then go to the test wiki's Special:Import page and choose the XML file that you got from the step above. I chose to "Import to default locations".
The import threw a 500 Server Error. When I check the Special:AllPages list of pages in the Template namespace, the entries for Template:Infobox etc. show up, but when I try to view them, I get another 500 Server Error.
== Debugging Special:Import Errors (2/1/16 - 2/5/16) ==
I had already created a phpinfo page for debugging, as described in this Stack Overflow answer. (Note that $ php -i | grep 'what_you_seek' also works, but the command-line interface (CLI) may use a different configuration file.)
Note that the phpinfo page shows the location of the Apache error logs (look for APACHE_LOG_DIR) and the location of the php.ini configuration file (look for Configuration File).
Checking the Apache error logs reveals that the Import failed with this error:
PHP Fatal Error: Call to undefined function pcntl_wifsignaled() in /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/LuaStandaloneEngine.php on line 645, referer: http://128.42.44.22/wiki/Special:Import
Looking for pcntl_wifsignaled in the phpinfo page reveals that it is disabled. I check that pcntl is actually installed:
$ php -m | grep pcntl
Then I try to comment out the line in the php.ini configuration file that disables the pcntl functions:
;disable_functions = pcntl_alarm,pcntl_fork,pcntl_waitpid,pcntl_wait,pcntl_wifexited,pcntl_wifstopped,pcntl_wifsignaled,pcntl_wexitstatus,pcntl_wtermsig,pcntl_wstopsig,pcntl_signal,pcntl_signal_dispatch,pcntl_get_last_error,pcntl_strerror,pcntl_sigprocmask,pcntl_sigwaitinfo,pcntl_sigtimedwait,pcntl_exec,pcntl_getpriority,pcntl_setpriority,
But that also doesn't work: Import gives a 500 error again, and I still can't view the pages from the Special:AllPages list in the Template namespace (another 500 error).
Day 2:
I try adding the following line to the php.ini configuration file under the "Dynamic Extensions" section to load the pcntl extension:
extension=pcntl.so
The import still doesn't work (another 500 error). This SO answer gives another way to check that Process Control (aka pcntl) is enabled:
$ php --ri pcntl
And it is enabled. Okay, so backtracking a little. Here's a post that mentions importing this stylesheet from Wikipedia (the source wiki, aka where we're getting our infobox templates from). I exported that page with the three options -- "Include only the current revision, not the full history", "Include templates", and "Save as file" -- all checked. Importing the resulting XML file was successful. But trying to import the infobox templates immediately afterwards still failed (500 error).
And here's a random tangent: an older version of the post above mentions that the XML file needs to be modified: all instances of "text/plain" need to be replaced with "CONTENT_TEXT_FORMAT". There's a thread on Mediawiki that suggests the same fix (the suggestion is also about 3 years old, so it may just be referring to the old version of the post without citing it). The thing that makes me uncertain is that Scribunto's ticket trackers have some tickets about this suggested fix: this one and this one came up when I searched for "content_text_format scribunto". Both seem to conclude that the fix isn't necessary in newer versions of Scribunto and Mediawiki, and there's even a backport left there for people who are still trying the fix. So it seems that the fix won't actually change anything.
Well, since I imported the Common.css stylesheet, and the mediawiki infobox tutorial suggests importing a javascript file as well, I might as well try importing that too. I exported the javascript from Wikipedia with the three options -- "Include only the current revision, not the full history", "Include templates", and "Save as file" -- all checked. Importing that XML is also successful.
Importing the infobox templates still fails (500 error). Back to the Apache error logs... *sigh*
I'm dealing with the same 'undefined function pcntl_wifsignaled()' error that I faced last time, so I didn't make any progress.
Well, there are a few concerning things.
- I'm running all of my diagnostic commands (like $ php -m | grep pcntl or $ php --ri pcntl) on the command line, which uses a different php.ini configuration file that may bear little relation to the Apache php.ini (though the two appear to be almost identical); see the check just after this list.
- The undefined function pcntl_wifsignaled() is called by a function in LuaStandaloneEngine.php called handleIOError(), which means that the pcntl_wifsignaled() error may just be a side effect of another error.
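One way to see which configuration file the CLI is actually loading (for Apache, the "Loaded Configuration File" row on the phpinfo page is the authoritative answer):

$ php --ini | grep "Loaded Configuration"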
But wait, there's an error in the apache error log about failing to load pcntl.so:
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20121212/pcntl.so' - /usr/lib/php5/20121212/pcntl.so: cannot open shared object file: No such file or directory in Unknown on line 0
So adding the "extension=pcntl.so" line didn't really help (as I figured). I need to get the pcntl.so file. PHP docs for Process Control say you need a special option when compiling PHP to enable pcntl.
$ cd ~/Downloads
$ wget -O php-5.5.9.tar.gz http://php.net/get/php-5.5.9.tar.gz/from/this/mirror
$ tar -zxvf php-5.5.9.tar.gz
$ cd php-5.5.9/ext/pcntl
$ sudo apt-get install php5-dev
$ phpize
$ ./configure
$ make
$ sudo cp modules/pcntl.so /usr/lib/php5/20121212
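One easy-to-miss step after dropping the new .so into place: Apache won't pick up the extension until it's restarted (which may well be what the "miraculous" fix below actually was):

$ sudo service apache2 restart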
== A Miraculous Fix (2/8/16) ==
So I loaded the Template:Infobox/doc page on the test wiki this afternoon, and it miraculously loaded! It still has a Lua script error, but at least it's not just 500 server errors all the way. The Lua script error seems to be a timeout, and from some Google searching, it seems that the default timeout is 10 seconds for Lua scripts, whereas markup-based templates have a 60-second limit.
Importing the XML file with the Infobox templates now works as well. Really not sure how this worked...
== Configuring Image Uploads (2/8/16) ==
Might as well try to make some infoboxes. But I need to configure file uploads. There's a Mediawiki page that I followed pretty closely.
First, I went to the php.ini configuration file and checked that file_uploads was set to On and noted that open_basedir isn't set.
Then I set the permissions for the images directory to 755 with:
$ sudo chmod -R 755 /var/lib/mediawiki/images
And I also added these lines to the apache2.conf configuration file:
<Directory /var/www/wiki/images> Options -Indexes </Directory>
Then I set the $wgEnableUploads option to true in LocalSettings.php.
When I try to upload a file, however, I get an exception.
I added this line to LocalSettings.php to print a backtrace:
$wgShowExceptionDetails = true;
And the resulting backtrace/error message:
[b9bfa0ee] /wiki/Special:Upload MWException from line 1873 of /var/lib/mediawiki/includes/filerepo/file/LocalFile.php: Could not acquire lock for 'File-Donald_August_19_(cropped).jpg.'
Backtrace:
#0 /var/lib/mediawiki/includes/filerepo/file/LocalFile.php(1152): LocalFile->lock()
#1 /var/lib/mediawiki/includes/upload/UploadBase.php(708): LocalFile->upload(string, string, string, integer, array, boolean, User)
#2 /var/lib/mediawiki/includes/specials/SpecialUpload.php(488): UploadBase->performUpload(string, string, boolean, User)
#3 /var/lib/mediawiki/includes/specials/SpecialUpload.php(197): SpecialUpload->processUpload()
#4 /var/lib/mediawiki/includes/specialpage/SpecialPage.php(384): SpecialUpload->execute(NULL)
#5 /var/lib/mediawiki/includes/specialpage/SpecialPageFactory.php(553): SpecialPage->run(NULL)
#6 /var/lib/mediawiki/includes/MediaWiki.php(281): SpecialPageFactory::executePath(Title, RequestContext)
#7 /var/lib/mediawiki/includes/MediaWiki.php(714): MediaWiki->performRequest()
#8 /var/lib/mediawiki/includes/MediaWiki.php(508): MediaWiki->main()
#9 /var/lib/mediawiki/index.php(41): MediaWiki->run()
#10 {main}
Google searches yield this thread and this thread dealing with this error message.
== Installing Ghost (3/14/2016) ==
You need to make sure you have the correct version of node installed (it should be a version that Ghost supports; at the time of writing, that's 0.10.x). Credits to this page for helping me out.
$ sudo apt-get install nodejs
$ nodejs -v
Go ahead and try to remove the old version of node and then clean up any unused packages.
$ sudo apt-get remove --purge node
$ sudo apt-get autoremove
Make a symbolic link from nodejs to node:
$ sudo ln -s /usr/bin/nodejs /usr/bin/node
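After making the link, node should report the same version that nodejs did:

$ node -v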
Install npm too and check the version:
$ sudo apt-get install npm
$ npm -v
Now you can follow the instructions for installing Ghost!
$ curl -L https://ghost.org/zip/ghost-latest.zip -o ghost.zip
$ sudo apt-get install unzip
$ sudo mkdir /var/www/ghost
$ unzip -uo ghost.zip -d /var/www/ghost
$ cd /var/www/ghost
$ sudo npm install
$ sudo npm start
Note that we had to install the unzip package. I also chose not to run Ghost in the production environment so that I'd have more debugging info and the ability to tinker with the theming.
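If we do want the production environment later, Ghost's install docs start it with a flag:

$ npm start --production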
== Ghost on Apache? (3/14/2016) ==
So Ed would rather not serve the blog off of port 2368. Looks like there's documentation for setting Ghost up on nginx and Apache. Since we would rather not move the mediawiki off of Apache, let's just try doing Ghost on Apache.
Turns out that doing so is a lot more hassle than I initially thought. Ghost and nginx may be easier, and maybe Ghost plus Apache isn't even that bad, but it's definitely more involved, especially when setting it up alongside another site (the mediawiki).
This page looked helpful, and this post has a complete tutorial if you want to go through with it, but it seems difficult to get Ghost set up correctly on Apache.
== Installing WordPress (3/14/2016) ==
Following the Detailed Instructions to install WordPress was enough to get me started:
$ cd ~/Downloads
$ wget https://wordpress.org/latest.tar.gz
$ tar -xzvf latest.tar.gz
Configure a database for WordPress (it can be called something other than wordpress) and make a new MySQL user (it can be called something other than mcnair_wp) that has all permissions on the wordpress database. Obviously, you should replace a_secure_password with an actual password for the user (but keep the quotes around the password when typing the MySQL command). FLUSH PRIVILEGES reloads the permissions tables.
$ mysql -u root -p
Enter password:
mysql> CREATE DATABASE wordpress; mysql> GRANT ALL PRIVILEGES ON wordpress.* TO "mcnair_wp"@"localhost" IDENTIFIED BY "password"; mysql> FLUSH PRIVILEGES; mysql> EXIT
You can verify that the wordpress database and user were created correctly by logging into the mysql command-line client as the new user:
$ mysql -u mcnair_wp -p
Enter password:
mysql> SHOW DATABASES;
mysql> USE wordpress;
mysql> EXIT
Create wp-config.php by copying the wp-config-sample.php file and renaming it:
$ cp ~/Downloads/wordpress/wp-config-sample.php ~/Downloads/wordpress/wp-config.php
$ sudo vi ~/Downloads/wordpress/wp-config.php
Edit the lines that define the DB_NAME, DB_USER, and DB_PASSWORD constants to have the values that you used to set up the MySQL database and user above.
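With the database settings above, the edited lines would look something like this (the password is a placeholder):

define('DB_NAME', 'wordpress');
define('DB_USER', 'mcnair_wp');
define('DB_PASSWORD', 'a_secure_password');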
Copy the wordpress directory to /var/lib/wordpress and then make a symlink from /var/www/blog to /var/lib/wordpress (much like how the mediawiki was done) so that http://128.42.44.22/blog points to the WP blog:
$ sudo cp -r ~/Downloads/wordpress /var/lib/wordpress
$ cd /var/www/html
$ sudo ln -s /var/lib/wordpress blog
Navigate a browser to http://128.42.44.22/blog/wp-admin/install.php to complete the installation (you'll be asked to create an admin user for the WordPress site).
== Installing Open Web Analytics (3/21/2016) ==
$ cd ~/Downloads
$ git clone https://github.com/padams/Open-Web-Analytics.git
$ cd /var/lib/mediawiki/extensions
$ cp -r ~/Downloads/Open-Web-Analytics ./owa
Edit LocalSettings.php and add the following line:
require_once('extensions/owa/mw_plugin.php');
Go to the list of Special Pages on the mediawiki and click on the Open Web Analytics special page to install OWA.
Long story short, this extension has only been tested up to Mediawiki version 1.16. I tried so hard, and got so far, but in the end, it doesn't even matter.
== Installing Piwik (3/21/2016) ==
Installing Piwik itself (instructions from Piwik):
$ cd ~/Downloads
$ wget http://builds.piwik.org/piwik.zip && unzip piwik.zip
$ sudo cp -r ~/Downloads/piwik /var/lib/piwik
$ cd /var/lib/piwik
$ sudo chmod 777 tmp
$ cd /var/www/html
$ sudo ln -s /var/lib/piwik analytics
Navigate a browser to http://128.42.44.22/analytics and go through the Piwik installation. Make sure you fix everything on the "System Checks" page.
When you get to the Database Setup page, you'll need to configure a MySQL database for Piwik. If you followed the steps for configuring a WordPress database, the steps are almost identical.
Configure a database for Piwik (it can be called something other than piwik) and make a new MySQL user (it can be called something other than mcnair_piwik) that has all permissions on the piwik database. As with WordPress, replace a_secure_password with an actual password for the user (keeping the quotes), and remember that FLUSH PRIVILEGES reloads the permissions tables.
$ mysql -u root -p
Enter password:
mysql> CREATE DATABASE piwik; mysql> GRANT ALL PRIVILEGES ON piwik.* TO "mcnair_piwik"@"localhost" IDENTIFIED BY "password"; mysql> FLUSH PRIVILEGES; mysql> EXIT
You can verify that the piwik database and user were created correctly by logging into the mysql command-line client as the new user:
$ mysql -u mcnair_piwik -p
Enter password:
mysql> SHOW DATABASES; mysql> USE piwik; mysql> EXIT
Installing the Piwik Integration extension for Mediawiki:
$ cd ~/Downloads
$ wget https://github.com/DaSchTour/piwik-mediawiki-extension/archive/master.zip
$ unzip -uo master.zip -d /var/lib/mediawiki/extensions
$ cd /var/lib/mediawiki/extensions
$ mv piwik-mediawiki-extension-master/ Piwik/
Edit LocalSettings.php to add these lines:
require_once("$IP/extensions/Piwik/Piwik.php"); $wgPiwikURL = "128.42.44.22/analytics/"; $wgPiwikIDSite = "1";
But it doesn't seem to register the visits...
Turns out Piwik honors DoNotTrack by default (as I learned here), so my browser's visits weren't being counted. There are visits now. Yay!
Also, to fix the broken little graphs next to the numbers, you have to get the most recent version of the GD module for PHP:
$ sudo apt-get install php5-gd
$ sudo service apache2 restart
== Installing Google Analytics (3/23/2016) ==
$ cd ~/Downloads
$ wget https://extdist.wmflabs.org/dist/extensions/googleAnalytics-REL1_26-d832801.tar.gz
$ tar -xzvf googleAnalytics-REL1_26-d832801.tar.gz
$ cd /var/lib/mediawiki/extensions
$ cp -r ~/Downloads/googleAnalytics ./GoogleAnalytics
Add these lines to LocalSettings.php:
require_once("$IP/extensions/GoogleAnalytics/googleAnalytics.php"); // Replace xxxxxxx-x with YOUR GoogleAnalytics UA number $wgGoogleAnalyticsAccount = 'UA-xxxxxxx-x';
== Installing Semantic Mediawiki (3/25/2016) ==
The installation process looks complicated, but let's be careful. First up, we're going to install Composer globally.
$ cd ~/Downloads
and then follow the download instructions here to install Composer to the ~/Downloads directory. Then move the composer.phar executable to a directory in our PATH:
$ sudo mv ~/Downloads/composer.phar /usr/local/bin/composer
$ composer --version
$ composer --list
Now you can just call "composer" instead of doing "php /path/to/composer/composer.phar"
Proceeding with the install for SMW...
$ cd /var/lib/mediawiki
$ sudo composer require mediawiki/semantic-media-wiki "~2.3" --update-no-dev
Make sure to replace "~2.3" with the appropriate latest release version (ignoring the third number, e.g. if the latest release is 2.3.1, then use "~2.3").
$ php maintenance/update.php
$ sudo vi LocalSettings.php
And add this line to the bottom of LocalSettings.php (replace domain_name.com with the wiki's actual domain):
enableSemantics('domain_name.com');
Check to see if the mediawiki site recognizes that the extension has been installed by visiting the Special:Version page.
You can test that the SMW annotations are working by following the instructions on this page: https://www.semantic-mediawiki.org/wiki/Help:Testing
== Installing Semantic Forms (3/25/2016) ==
Don't install from Mediawiki's Extension Distributor (according to the documentation)! Instead, get it from the Git repository:
$ cd ~/Downloads
$ git clone https://git.wikimedia.org/git/mediawiki/extensions/SemanticForms.git
$ cp -r ./SemanticForms /var/lib/mediawiki/extensions/SemanticForms
$ cd /var/lib/mediawiki
$ sudo vi LocalSettings.php
Add the following line to LocalSettings.php:
include_once("$IP/extensions/SemanticForms/SemanticForms.php");
== Semantic Forms Examples (3/28/2016) ==
There's an example data structure on the Semantic Forms page. I followed most of the steps, except for Enabling links to forms. I couldn't get #formredlink to work properly (the template wasn't parsing the silent property declaration with #set properly), so I instead added a line to the "Was written by" property page:
[[Creates pages with form::Author]]
This way, once a new Book page is created, clicking the redlink for the book's author automatically generates an Author page.
I also added #default_form lines at the end of the Book and Author template pages, e.g. for the Author template page, the last few lines looked like:
...
[[Category:Authors]]
{{#default_form:Author}}
</includeonly>
and I did something similar for the Book template page. This way, every Book and Author page will have an "Edit with form" tab in addition to the "Edit" tab (and the "Edit with form" tab is significantly more useful).
== Installing Cargo (3/28/2016) ==
$ cd ~/Downloads
$ git clone https://git.wikimedia.org/git/mediawiki/extensions/Cargo.git
$ cp -r ./Cargo /var/lib/mediawiki/extensions/Cargo
$ cd /var/lib/mediawiki
$ sudo vi LocalSettings.php
Add the following line to the LocalSettings.php configuration file:
require_once( "$IP/extensions/Cargo/Cargo.php" );
Then back to the console to do some PHP updating:
$ php maintenance/update.php
== Cargo Examples (3/28/2016) ==
First, remove Semantic Mediawiki by opening the composer.json file in the Mediawiki root directory and deleting the line that requires the semantic-media-wiki package, then run sudo composer update from the Mediawiki root directory.
Create templates using the special page listed under the Semantic Forms category. Then, to create the data tables, go to each template page and choose "Create data table" from the dropdown next to the edit tab. After the data tables have been created once, you can recreate the data from the command line (if you do this before creating the data tables, nothing happens...):
$ cd /var/lib/mediawiki/extensions/Cargo/maintenance
$ php cargoRecreateData.php
You'll have to edit the template pages to add queries, but at that point, you may just want to write the templates yourself.
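For reference, a minimal query looks something like this; the Films table and Director field are illustrative assumptions (matching the film example below), not tables that exist yet:

{{#cargo_query:
tables=Films
|fields=_pageName
|where=Director='Byron Howard'
|format=ul
}}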
As with Semantic Mediawiki + Semantic Forms, you can add the #default_form parser function to the template page to display an "edit with form" tab alongside the "edit" tab (you'll likely have to refresh the page to see the changes).
== Cargo Data structuring (3/30/2016) ==
Sahil and I came up with a SQL database schema for the organizations and events in the startup ecosystem. Organization subtypes are: startups, VC funds, accelerators, incubators, service providers. Event subtypes are: financing, training, liquidity. Each subtype has fields specific to it, but all organizations need to have a name, logo, URL, address, founding date, and status, and all events need to have a date and need to include which organizations are involved.
We tried doing foreign keys, but Cargo doesn't support them, so maybe we should look into other options. One easy way out would be to just duplicate the columns that are common to all organizations in each organization subtype's table, but this seems like bad practice. I found #cargo_attach, which may help us in this sort of situation; see the sketch below.
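A sketch of how that might look (table, field, and template names are placeholders based on the schema above, nothing that exists on the wiki yet): the common fields get declared once on a base template, and each subtype template attaches to store rows into the same table:

<!-- on Template:Organization -->
{{#cargo_declare:_table=Organizations
|Name=String
|Logo=File
|URL=URL
|Address=Text
|Founding_date=Date
|Status=String
}}

<!-- on Template:Startup (and the other subtype templates) -->
{{#cargo_attach:_table=Organizations}}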
There is an issue with using #cargo_query in the test wiki. When a new page is created that is related to an existing page, the existing page should update automatically to display its relation to the new page. Instead, the old page won't display the relation until someone opens the old page, hits the edit button, and saves an edit, even if there is no change to the page text from the previous version. For example, on the page Bolt you can see it is directed by Byron Howard, and on the page Byron Howard you can see that Bolt is one of his films; this is working as intended. However, the page Tangled shows that Tangled is also directed by Byron Howard, yet Byron Howard's page did not show Tangled as one of his films until I edited the page, at which point it updated to include Tangled.
It seems the above problem is caused by the fact that when #cargo_query is used on a page, the query is not run each time the page is loaded, but only when an edit to the page is saved. This could cause display issues in the future, e.g. pages for accelerators not displaying newly added companies.
== Back to Semantic Mediawiki (4/8/2016) ==
Semantic Mediawiki might be better for the inheritance (i.e. foreign keys in SQL). In SMW, we can define the properties and templates for a superclass and a subclass. If we make a form that includes both templates, the form creates pages that include the templates for both the superclass and the subclass, with all the properties in place. This might be the solution we're looking for, but it seems to be more difficult to query, since there is no explicit distinction between the attributes common to all subclasses and the attributes specific to a single subclass.
Ed likes it so far. We should just move forward with the actual data structure on SMW until major roadblocks prevent further progress.
Some notes on why SMW seems to be more flexible than Cargo: in SMW the properties are responsible for storing data, whereas in Cargo the templates are responsible for storing data (using #cargo_store calls). This means that a Semantic Form using SMW properties can safely include multiple templates, whereas a Semantic Form using Cargo tables needs to ensure that the templates each affect different tables, which isn't the case for inheritance.
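As a concrete (hypothetical) illustration: the superclass template sets the shared properties, the subtype template sets its own, and a single form can include both templates on the same page. Property and template names here are assumptions, not pages that exist yet:

<!-- in Template:Organization -->
[[Has name::{{{name|}}}]]
[[Has URL::{{{url|}}}]]

<!-- in Template:Startup -->
[[Has funding stage::{{{stage|}}}]]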
== To-do list ==
- Google Analytics on the test blog. We need FTP access (port 21) to be able to install new plugins, apparently...