TeamForge - TestLink integration: general usage FAQ

Here are some FAQs on the TeamForge - TestLink integration.

Does the TestLink integration work with LDAP?
Yes, TestLink works with LDAP and no additional setup is required. TeamForge is integrated with LDAP, and users can seamlessly navigate to TestLink after single sign-on.
What is the maximum number of test cases a TestLink project can have?
You can create as many test cases as you want for a TestLink project.

This was tested with a sample TestLink project that had 1000 test suites with 100 test cases attached to each suite, for a total of 1000 x 100 = 100,000 test cases. These test cases were easily searchable and navigable without any performance degradation.

What is the licensing mechanism for the integration?
The TestLink integration complies with GPLv3 as the TestLink product is GPLv3 licensed.
Does CollabNet support the TestLink product?
TestLink is an open source product, and CollabNet supports only the TestLink integration with TeamForge, not issues pertaining to the TestLink product itself. Issues with the TestLink core product have to be reported to the TestLink forums.
Does CollabNet support TestLink, Jenkins and Selenium loop?
The TestLink, Jenkins and Selenium architecture is a reference that demonstrates the value you gain by integrating TestLink with Jenkins. It is provided only for your reference; you can pick any automation tool along with Jenkins to integrate with TestLink. CollabNet supports only the integration built for TestLink.
Can we install TestLink on the same box as CollabNet TeamForge?
Yes, that is possible: TestLink can be installed on the same box as CollabNet TeamForge. Capacity planning has to be done by estimating how many test cases you will add to TestLink every year and the number of attachments in each test case. Attachments are stored in the file system, and test case data is stored in the TestLink database.
Why is a post installation migration script required?
The post-installation migration script moves TeamForge users to TestLink. This is a one-time activity when you install the integration for the first time. After the users are moved to TestLink, the permissions for the migrated users have to be added or modified in TeamForge.
What happens if I run the post installation script for the second time by mistake?
No worries. The script will not corrupt any data and it will ignore the users who are already migrated.
Can I install Jenkins and Selenium in the same box as TeamForge?
It is recommended to install Jenkins and Selenium together on a separate server for scalability. In Jenkins, a job is created for every test plan. If the project has many test plans that are executed in parallel, many jobs have to be created, and running multiple jobs in parallel consumes more resources on the server.
Why is the defect tracker plug-in disabled in TestLink after installing the integration?
This is because the TestLink integration is designed so that users track defects in TeamForge.

Using the TeamForge tracker for defects provides traceability within the TeamForge ALM space, from requirements to defects to test cases.

I am an existing TeamForge user and recently installed TestLink integration. How can I associate new test cases with the old requirements?
You can manually associate test suites and test cases with the requirements in TeamForge.

For example, if you are an existing user and have recently installed the TestLink integration, you can manually create and associate test suites and their corresponding test cases with your existing TeamForge requirements. Similarly, you can manually associate new requirements in TeamForge with the existing test suites and test cases in TestLink. For more information about manual association, see Associate a tracker artifact with TestLink (Manual).

Can we version the test cases since we modify them many times?
Yes, versioning is available at the individual test case level. Use the New version button on the Test Specification page of TestLink to create a new version of a test case and to compare the differences between test case versions.
Where can I get more information about the TeamForge - TestLink integration?
The project is hosted at the following URL:

https://ctf.open.collab.net/sf/projects/ctftestlink/

Note: This project is currently not accessible to non-CollabNet employees.

A lot of additional information is available in the wiki, with links to presentations and blog content.

How can it help?
With the recent addition of TestLink, a widely adopted open source test management product, TeamForge now has the capability to create test case trackers and associate them with requirements. Users can also execute test cases and store test results in TeamForge.

TestLink uses a tracker to store test cases and also ties test plans to builds. By integrating TeamForge with TestLink, these test management features become available in TeamForge as part of an end-to-end ALM solution.

What type of test reports can we expect in TeamForge?
Two additional reports have been created which you can add to the Project Home page. They are:
  • Deviation Report: This report flags requirement changes that did not have corresponding changes in test cases.
  • Requirement test case evaluation report: This report displays all the requirements whose test case executions failed.
I would like to know how Selenium communicates with TestLink to update the status of test cases. Is there an available API to contact TestLink?
The automation works through the integration of three different tools: Jenkins, Selenium and the TestLink plug-in for Jenkins. The Jenkins TestLink Plug-in communicates with TestLink through an open source Java API. After the build is complete, the plug-in executes all test cases that are marked "Automated" by invoking the test file specified in the test case; a user-defined field in the test case holds the test file that Jenkins pulls and executes. There should be a one-to-one relationship between the test case in TestLink and the automated test file. The test file is an automation script that represents a test case; it can be a plain Java class, a JUnit class or a Selenium Java class. Once the test files are executed, the test case is updated with the result 'pass' or 'fail' through the plug-in. The test file in our reference automation is a Selenium Java file generated through Selenium RC or written manually in Java using the Selenium API. You can use other CI tools and test files from different automation tools to make the whole automation work.
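
To make the one-to-one mapping between a TestLink test case and its test file concrete, here is a minimal sketch of such a test file written as a JUnit class driving Selenium. The class name, URL and assertion are hypothetical, and the WebDriver style shown is only one of the possible forms (plain Java, JUnit or Selenium Java class) mentioned above; this is not code shipped with the integration.

    import static org.junit.Assert.assertEquals;

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    // Hypothetical test file representing exactly one TestLink test case
    // marked "Automated". The file/class name is what you would record in
    // the test case's user-defined field so that Jenkins can pull and
    // execute it after the build.
    public class LoginPageTest {

        private WebDriver driver;

        @Before
        public void setUp() {
            driver = new FirefoxDriver();            // or a remote driver pointing at a Selenium server
            driver.get("http://example.com/login");  // hypothetical application URL
        }

        @Test
        public void loginPageShowsExpectedTitle() {
            // The JUnit pass/fail outcome of this method is what the
            // Jenkins TestLink Plug-in reports back to TestLink.
            assertEquals("Login", driver.getTitle());
        }

        @After
        public void tearDown() {
            driver.quit();
        }
    }

Jenkins compiles and runs such a class as part of the build, and the Jenkins TestLink Plug-in then matches the JUnit result to the TestLink test case through the user-defined field and updates its execution status.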
Do you have an automation reference with JMeter?
Not at the moment. We are in the process of exploring JMeter for our reference automation.
Please show defects generated automatically with Selenium.
This feature will be available in our interim release; currently it is disabled based on feedback received from pilot customers. The reason is that there are many scenarios in which the automation will generate duplicate defects. For example, if a CI tool runs the build and the tests daily and automatically, defects will be created on failures; if the team is not able to fix those defects before the tests run again, duplicate defects will be created. So we are looking at all the scenarios before turning on this feature in the next release, and we are also trying to make defect creation intelligent enough to update an existing defect instead of creating new ones.
How are Selenium defects added?
The Selenium defects are added to TestLink through the Jenkins TestLink Plug-in. In your Selenium Java test case, a one-line code snippet needs to be added for the success scenario and for the failure scenario. This assert snippet is recognized by the Jenkins TestLink Plug-in, which updates the corresponding test case in TestLink with the right status.
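
As a minimal, hypothetical sketch of what those one-line snippets look like (the class, method and message names below are invented for illustration), the success and failure scenarios come down to a JUnit assertion or failure call whose outcome the Jenkins TestLink Plug-in translates into the TestLink test case status:

    import static org.junit.Assert.assertTrue;
    import static org.junit.Assert.fail;

    import org.junit.Test;

    public class CheckoutTest {

        @Test
        public void orderIsConfirmed() {
            boolean orderConfirmed = runSeleniumCheckoutSteps(); // Selenium steps omitted
            // Success scenario: a passing assertion lets this method pass, and the
            // plug-in marks the corresponding TestLink test case as passed.
            assertTrue("Order confirmation was not displayed", orderConfirmed);
        }

        @Test
        public void paymentDeclinedShowsError() {
            boolean errorShown = runSeleniumDeclinedPaymentSteps(); // Selenium steps omitted
            if (!errorShown) {
                // Failure scenario: an explicit fail() (or a failing assertion) makes this
                // method fail, and the test case is marked as failed in TestLink.
                fail("Expected payment-declined message was not displayed");
            }
        }

        // Hypothetical helpers standing in for the actual Selenium interactions.
        private boolean runSeleniumCheckoutSteps() { return true; }
        private boolean runSeleniumDeclinedPaymentSteps() { return true; }
    }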
My test results are stored in XML. Is there a way to import these test results into TestLink? I am using C# to create the XML.
TestLink has an import feature that lets you import XML results generated from other automation tools.
We currently use both TeamForge and TestLink. Is there a way we can glue users?
The integration has to be installed to glue the existing TeamForge and TestLink systems so that they work as one. Next, there are migration scripts to migrate data between TeamForge and TestLink: users who are in TestLink will be migrated to TeamForge and vice versa. The script has to be run only once, after installing the integration. After that, any projects, users, permissions and roles are automatically synchronized between the two systems. If the same user name exists in both TestLink and TeamForge, there is a user conflict that the migration script will report, and that specific user will not be migrated. The scenario below will help you understand how to resolve the conflict.

For example, John is a user who exists in both TeamForge and TestLink. TeamForge already has Project A, TestLink has Project B, and they are two separate systems. The one-time migration script moves Project B from TestLink to TeamForge, and user John has access to Project A in TeamForge only. To fix this, the TeamForge admin has to check whether the conflicting TestLink user John and the TeamForge user John are the same person. If they are, then John already has access to Project A in TeamForge, so he has to be given permission to Project B to resolve the conflict. John then has access to both Project A and Project B.

We have a test result in TestLink set to 'Failed'. Is it possible to automatically fill some of the fields of the created defect, especially the Assign To field?
Currently, we push fields such as the time of execution, the test plan name, the build and user comments to TeamForge. According to the feedback received so far, automating the Assigned To field is difficult: there are cases where the defect requires specialized skills that only one person on the team has, and it is hard for the system to identify the right assignee. Even if the system could identify an assignee, that person may be too busy, and we would then have to look at workload distribution to see who is free before assigning. I am open to receiving more feedback from users on what other fields would make sense in the auto-generated defect, so that we can plan to incorporate the request in the future roadmap.
Is it possible to configure the TestLink project to generate a specific defect tracker type? For instance, we have functional defects and performance defects.
During the initial configuration of the TestLink integration with TeamForge, TestLink expects the project admin to provide the tracker IDs of all requirement trackers. For example, if you have epic, story and task trackers, you need to provide the tracker IDs of all three trackers, comma separated, during the configuration. For the defect tracker, however, the integration accepts only one defect tracker ID; you cannot have multiple defect trackers.
Which TestLink version supports synchronization with TeamForge?
The integration works on TeamForge 7.0, 7.1, 7.2 and 8.0 with TestLink version 1.9.11.
Can you fetch test results from TestLink (and from multiple projects in TestLink) to dashboard web page automatically?
At the moment, the integration maps one TeamForge project with one TestLink project. Currently, you cannot report out of multiple TestLink projects into one TeamForge project.
Can we shut off auto defect generation?
Yes, you can turn on/off auto defect generation at a project level.
What if I don’t want a test suite created for each requirement?
You can set the TestSuite field to 'NONE' instead of 'Create' while creating a requirement, and this will not create a test suite in TestLink. This is controlled at the individual requirement artifact level.
How to ensure quality test cases? Is there a way to indicate that test cases have been reviewed?
TestLink provides custom fields: multiple custom fields can be created and used as a checklist. The reviewer of the test case has to review the test case and update the checklist item by item. The checklist can be a single text area or multiple check boxes. Once all the items in the checklist are complete, the reviewer can mark the test case as approved by selecting another custom field check box.