Install TeamForge 8.1 with Database and SCM on separate servers

In this option, we install the Database (Operational Database) and Datamart (Reporting Database) on one server, SCM (Subversion and CVS) and Git on a second server, and all other services on the application server.

The following services run on the application server (we call this my.app.host):
  • TeamForge Application Server (app)
  • ETL Server (etl)
  • Search Server (indexer)
  • Black Duck Code Sight Server (codesearch), if you choose to install it
The following services run on the database server (we call this my.db.host):
  • Database (Operational Database)
  • Datamart (Reporting Database)
The following services run on the SCM server (we call this my.scmandgit.host):
  • SCM Integration Server (Subversion and CVS)
  • Git Integration Server (gerrit)
Note: In a multi-server installation of TeamForge, ensure that all servers have the same system time zone for ETL to function properly.
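
A quick way to confirm this is to compare the time zone reported on each of the three servers; a minimal check (adjust to your environment) is:
  • date
  • grep TIMEZONE /etc/sysconfig/clock
The time zone and UTC offset shown should match on my.app.host, my.db.host and my.scmandgit.host. On SUSE, the configured zone is typically recorded in /etc/sysconfig/clock.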

Always log on to the server as the root user.

Do this on the main TeamForge application server - my.app.host

  1. Install SUSE Linux Enterprise Server 11 SP2 and log in as root.
    Important: Don't customize your installation. Select only the default packages list.
  2. Check your basic networking setup. See Set up networking for your TeamForge server for details.
  3. Configure your TeamForge 8.1 installation repository. See TeamForge installation repository configuration for SUSE.
  4. Install the following application packages.
    1. To install the TeamForge application packages, run the following command:
      • zypper install teamforge
    2. To install Black Duck Code Sight, run the following command:
      • zypper install teamforge-codesearch
  5. Set up your site's master configuration file. (A consolidated example of the tokens discussed in this step appears after this procedure.)
    • vi /opt/collabnet/teamforge-installer/8.1.0.1/conf/site-options.conf
    1. Identify the servers and services running on them.
      HOST_localhost=app etl indexer
      DOMAIN_localhost=my.app.domain.com
      HOST_my.db.domain.com=database datamart
      HOST_my.scmandgit.domain.com=subversion cvs gerrit  
    2. Add 'codesearch' to the HOST_localhost token if you are installing Black Duck Code Sight.
      HOST_localhost=app etl indexer codesearch
    3. Configure the database and datamart settings.
      Note: For more information about configuring variables, see site-options.conf
      DATABASE_TYPE=postgresql
      DATABASE_USERNAME=ctfuser
      DATABASE_NAME=ctfdb
      DATABASE_READ_ONLY_USER=ctfrouser
      REPORTS_DATABASE_USERNAME=ctfrptuser
      REPORTS_DATABASE_NAME=ctfrptdb
      REPORTS_DATABASE_READ_ONLY_USER=ctfrptrouser
      REPORTS_DATABASE_MAX_POOL_SIZE=30
      Note: The database name and username values are arbitrary alphanumeric strings.
    4. TeamForge 7.1 and later support automatic password creation. See AUTO_DATA for more information.
    5. Password obfuscation

      The password obfuscation is enabled by default. As a result, all password-related tokens are encrypted in all the TeamForge configuration files.

      Restriction: The password-related tokens cannot contain the following characters in the site-options.conf file: $<>/\'"`
      • To disable password obfuscation, set OBFUSCATION_ENABLED=false.
      • To configure the obfuscation key, set OBFUSCATION_KEY=<Any AlphaNumeric value with length >= 8 bytes>. The default value of OBFUSCATION_KEY token is XSJt43wN.
    6. Turn on the SSL for your site by editing the relevant variables in the site-options.conf file. To generate the SSL certificates, see Generate SSL certificates.
      • SSL=on
      • SSL_CERT_FILE
      • SSL_KEY_FILE
      • SSL_CHAIN_FILE
      Note: The SSL_CERT_FILE and SSL_KEY_FILE tokens need an absolute path. The SSL_CHAIN_FILE token is optional.
    7. If the REQUIRE_PASSWORD_SECURITY token is enabled, set a value for the PASSWORD_CONTROL_EFFECTIVE_DATE token.
      CAUTION:
      The Password Control Kit (PCK) disables, deletes or expires user accounts that don't meet the password security requirements starting from the date set for the PASSWORD_CONTROL_EFFECTIVE_DATE token. If a date is not set, the PCK disables, deletes or expires user accounts immediately. See PASSWORD_CONTROL_EFFECTIVE_DATE for more information.
    8. Include the SCM_DEFAULT_SHARED_SECRET token in the site-options.conf file of the primary TeamForge server and give it a value of 16-24 characters. Remember to use the same value on the external SCM integration server as well.
    9. If the REQUIRE_RANDOM_ADMIN_PASSWORD token is already set to true, set the ADMIN_EMAIL token to a valid email address. ADMIN_EMAIL=root@{__APPLICATION_HOST__}
    10. If you have LDAP set up for external authentication, you must set the REQUIRE_USER_PASSWORD_CHANGE site options token to false.
    11. Make sure the DEDICATED_INSTALL token is set to true. This simplifies the installation, as the TeamForge installer configures Apache and PostgreSQL automatically.
    12. Make sure that the following tokens have a value if ETL is enabled.
      SOAP_ANONYMOUS_SHARED_SECRET
      ETL_SOAP_SHARED_SECRET
    13. Configure Black Duck Code Sight tokens if you are installing Black Duck Code Sight. See Black Duck Code Sight site-option tokens.
    14. To enable the history protection feature of TeamForge Git integration, set the GERRIT_FORCE_HISTORY_PROTECTION=true. For more information, see GERRIT_FORCE_HISTORY_PROTECTION.
    15. Make sure the PostgreSQL tokens in the site-options.conf file are set as recommended in the following topic: What are the right PostgreSQL settings for my site?
    16. Important: This step is required if you want Git notification emails.
      Update the JAMES_ACCEPTED_RELAYS site-options token with the Git server's IP address. See JAMES_ACCEPTED_RELAYS for more information.
      JAMES_ACCEPTED_RELAYS=127.0.0.1,{__CEE_DOMAIN__},<The IP address of the Git server>
    17. Save the site-options.conf file.
  6. Run the install_jdk_suse.sh script.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install_jdk_suse.sh
  7. Recreate the runtime environment.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install.sh -r -I -V
  8. Important: Do this if you have updated the JAMES_ACCEPTED_RELAYS token with the Git server's IP address.
    Edit the /opt/collabnet/teamforge/runtime/james/apps/james/SAR-INF/config.xml file and comment out the <authorizedAddresses> node. For example:
    <!--  <authorizedAddresses>127.0.0.0/8</authorizedAddresses>   -->
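
For reference, the fragment below pulls together the site-options.conf tokens discussed in step 5 into a single illustrative sketch for my.app.host. The host and domain names follow the examples used in this guide, and the angle-bracket entries are placeholders you must replace with your own values; this is not a complete site-options.conf file.
  HOST_localhost=app etl indexer codesearch
  DOMAIN_localhost=my.app.domain.com
  HOST_my.db.domain.com=database datamart
  HOST_my.scmandgit.domain.com=subversion cvs gerrit
  DEDICATED_INSTALL=true
  DATABASE_TYPE=postgresql
  DATABASE_USERNAME=ctfuser
  DATABASE_NAME=ctfdb
  DATABASE_READ_ONLY_USER=ctfrouser
  REPORTS_DATABASE_USERNAME=ctfrptuser
  REPORTS_DATABASE_NAME=ctfrptdb
  REPORTS_DATABASE_READ_ONLY_USER=ctfrptrouser
  REPORTS_DATABASE_MAX_POOL_SIZE=30
  SSL=on
  SSL_CERT_FILE=<absolute path to the SSL certificate file>
  SSL_KEY_FILE=<absolute path to the SSL private key file>
  SCM_DEFAULT_SHARED_SECRET=<16-24 character value, also used on the SCM server>
  SOAP_ANONYMOUS_SHARED_SECRET=<value>
  ETL_SOAP_SHARED_SECRET=<value>
  JAMES_ACCEPTED_RELAYS=127.0.0.1,{__CEE_DOMAIN__},<The IP address of the Git server>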

Do this on the database server - my.db.host

  1. Install SUSE Linux Enterprise Server 11 SP2 and log in as root.
    Important: Don't customize your installation. Select only the default packages list.
  2. Check your basic networking setup. See Set up networking for your TeamForge server for details.
  3. Configure your TeamForge 8.1 installation repository. See TeamForge installation repository configuration for SUSE.
  4. Install the TeamForge database packages.
    • zypper install teamforge-database
  5. Copy the site-options.conf file from the application server to the /opt/collabnet/teamforge-installer/8.1.0.1/conf directory on the database server (for example with scp, as sketched after this procedure).
  6. Modify the host token settings in the site-options.conf file.
    Important: If you choose not to use the application server's site-options.conf file, then don't forget to copy the value of the AUTO_DATA token from the application server.
    HOST_my.db.host=database datamart
    Note: 'HOST_my.db.host' is just an example. As you are installing the database on a separate server, do not use 'HOST_localhost'. Use 'HOST_<valid host name>' instead.
    DOMAIN_my.db.host=my.db.domain.com
    HOST_my.app.domain.com=app etl indexer codesearch
    HOST_my.scmandgit.domain.com=subversion cvs gerrit
  7. Run the install_jdk_suse.sh script.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install_jdk_suse.sh
  8. Recreate the runtime environment.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install.sh -r -I -V
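
One way to handle step 5 above (copying the master configuration file from the application server) is a simple scp, assuming the root user on this server has SSH access to the application server; this is only a sketch, and any other file transfer method works just as well.
  • cd /opt/collabnet/teamforge-installer/8.1.0.1/conf
  • scp root@my.app.domain.com:/opt/collabnet/teamforge-installer/8.1.0.1/conf/site-options.conf .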

Do this on the SCM Server - my.scmandgit.host

  1. Install SUSE Linux Enterprise Server 11 SP2 and log in as root.
    Important: Don't customize your installation. Select only the default packages list.
  2. Check your basic networking setup. See Set up networking for your TeamForge server for details.
  3. Configure your TeamForge 8.1 installation repository. See TeamForge installation repository configuration for SUSE.
  4. Install the TeamForge SCM and Git packages.
    • zypper install teamforge-scm teamforge-git
  5. Copy the site-options.conf file from the application server to the /opt/collabnet/teamforge-installer/8.1.0.1/conf directory on the SCM server.
  6. Modify the host token settings in the site-options.conf file.
    Important: If you choose not to use the application server's site-options.conf file, then don't forget to copy the value of the AUTO_DATA token from the application server, and make sure the SCM_DEFAULT_SHARED_SECRET value matches the one on the application server (a quick check is sketched after this procedure).
    HOST_my.scmandgit.host=subversion cvs gerrit
    Note: 'HOST_my.scmandgit.host' is just an example. As you are installing SCM on a separate server, do not use 'HOST_localhost'. Use 'HOST_<valid host name>' instead.
    DOMAIN_my.scmandgit.host=my.scmandgit.domain.com
    HOST_my.app.domain.com=app etl indexer codesearch
    HOST_my.db.domain.com=database datamart
  7. Run the install_jdk_suse.sh script.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install_jdk_suse.sh
  8. Recreate the runtime environment.
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./install.sh -r -I -V
  9. Set up the initial site data (bootstrap).
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./bootstrap-data.sh
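
Because the SCM_DEFAULT_SHARED_SECRET value on this server must match the value set on the application server, a quick sanity check after editing the file is to print the token on both hosts and compare, for example:
  • grep SCM_DEFAULT_SHARED_SECRET /opt/collabnet/teamforge-installer/8.1.0.1/conf/site-options.conf
Run the same command on my.app.host; the two values must be identical.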

Do the following on the application server - my.app.host

  1. Set up the initial site data (bootstrap).
    • cd /opt/collabnet/teamforge-installer/8.1.0.1
    • ./bootstrap-data.sh
  2. Start TeamForge. (A quick check to confirm the services came up is sketched after this procedure.)
    • /etc/init.d/collabnet start
  3. Note: If the token REQUIRE_USER_PASSWORD_CHANGE is set to true, log in to the TeamForge user interface, change the admin password, and then run the post-install.py script.
    Run the TeamForge post installation script. For more information, see post-install.py.
    • /opt/collabnet/teamforge/runtime/scripts/post-install.py
    Note: If you face any Git-related issues while running the post-install.py script, see Post install fails for GIT. What should I do?
  4. If you have installed Black Duck Code Sight, then install the license for Black Duck Code Sight. For more information, see Install the Black Duck Code Sight license.
  5. Run the svn_cache.sh script.
    • cd /opt/collabnet/teamforge/runtime/scripts/codesearch/
    • ./svn_cache.sh <Repository Base URL Path of the SCM Integration Server>

    Provide the repository base URL path of the SCM integration server, for example, "http://myint.box.net/svn/repos", where myint.box.net is the host running the SCM integration server.

    If you add a new integration server later, you must run the svn_cache.sh script again on the TeamForge application server after creating the new integration server.
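
Optionally, after starting TeamForge in step 2, you can confirm that the services came up before moving on to the SCM server. The commands below are only a sketch: the 'status' action of the init script and the /sf/ landing path are assumptions about your setup, and curl must be available.
  • /etc/init.d/collabnet status
  • curl -sk -o /dev/null -w "%{http_code}\n" https://my.app.domain.com/sf/
An HTTP status code of 200 (or a 302 redirect to the login page) indicates that the web front end is answering.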

Do this on the SCM server - my.scmandgit.host

  1. Run the TeamForge post installation script. For more information, see post-install.py.
    • /opt/collabnet/teamforge/runtime/scripts/post-install.py
  2. Important: This step is required if you want Git notification emails.
    Edit the /opt/collabnet/gerrit/etc/gerrit.config file and update the 'smtpServer' property with the TeamForge application server's host name (a quick way to verify the setting is sketched after this procedure). For example:
    [sendemail]
        smtpServer = <TeamForge application server host name>
    1. Restart gerrit.
      • /etc/init.d/collabnet stop gerrit
      • /etc/init.d/collabnet start gerrit
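
To double-check the smtpServer value set in step 2, you can read it back with the git client's config reader, since gerrit.config uses the git-config file format. This assumes the git command is available on the SCM server, which is normally the case once the Git integration packages are installed.
  • git config -f /opt/collabnet/gerrit/etc/gerrit.config sendemail.smtpServer
The command should print the TeamForge application server's host name you entered.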

Do the following on the application server - my.app.host

  1. Revoke the superuser permissions of the database and datamart users.
    • /opt/collabnet/teamforge/runtime/scripts/revoke-superuser-permission.py
  2. Restart the collabnet services.
    • /etc/init.d/collabnet restart
  3. Apply some finishing touches and make sure everything is running smoothly.
    1. Reboot the server and make sure all services come up automatically at startup (a quick chkconfig check is sketched after this list).
    2. Log into your site as the administrator. The value of the DOMAIN variable in the site-options.conf file is the URL to log into.
    3. Create a sample project. See Create a TeamForge project.
    4. Write a welcome message to your site's users. See Create a site-wide broadcast.
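
For step 3.1, one quick way to confirm that the collabnet service is registered to start at boot, assuming the installer registered the init script with the standard SUSE runlevel tools, is:
  • chkconfig --list collabnet
The output should show the service turned on for the default runlevels (typically 3 and 5).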