  • Interns reporting their activities

    Ladies and Gentlemen,

    We expect your internship to give you a strong section in your resume, one that improves your marketability. To achieve that, we need to know as many of your SQA activities, small and big, as possible. This is why:
    • At the end of each day, write down a list of the things you did that day.
    • At the end of each week, compile a weekly report and email it to me.

    That is not a "nice to have" thing; it is a "must have".

  • #2
    To reinforce the requirement described above, we have decided not to schedule a resume preparation appointment until the weekly reports have been submitted.

    This will benefit graduates by giving them a better resume when they enter the job market.


    • #3
      Sometimes I see interns having difficulty putting into words what they do on an everyday basis. To help with that, here is a list of typical activities of a software tester:
      • Implement and provide input for Test Plans.
      • Design or evaluate Test Plans.
      • Design Test Cases.
      • Create Test Scenarios.
      • Check / review the Test Cases (TC).
      • Keep track of new requirements.
      • Escalate issues about project requirements.
      • Perform functional testing.
      • Prepare test data.
      • Perform regression testing.
      • Report bugs, track defects, and resolve issues with the developers.
      • Track problem reports using a bug-tracking system.
      • Produce test evaluation reports.
      • Participate in software design reviews.
      • Participate in software walkthroughs.
      • Manage the testing environment.
      • Lead and facilitate testing.
      • Provide support to junior team members.
      • Run and maintain test automation scripts for regression testing.
      • Perform CRUD database testing using SQL (a sketch follows this list).
      • Conduct and organize internal training sessions.
      • Provide test planning and assign tasks to all testing team members.
      • Verify that all team members have sufficient work.
      • Organize meetings and prepare meeting agendas.
      • Track and report testing activities.
      • Check for timely delivery of milestones.
      • Verify the content and structure of all testing documents and reports.
      • Motivate team members.
      • Provide cost/progress/test status reporting.
      • Design test scenarios for automation.
      • Create test scripts using automated test tools.
      • Design concurrency, load, and regression testing suites.
      • Design and develop API test cases.
      • Design and write automated regression test packages using scripts, high-level languages, and a commercial test tool.
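      For the CRUD database testing item above, here is a minimal sketch of what such a check could look like, written in Python against a throwaway in-memory SQLite database; the table and column names are made up for illustration:

          import sqlite3
          import unittest

          class CrudSmokeTest(unittest.TestCase):
              """Minimal Create/Read/Update/Delete check against a scratch database."""

              def setUp(self):
                  # In-memory database; a real test would point at the product's DB.
                  self.conn = sqlite3.connect(":memory:")
                  self.conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

              def test_crud_cycle(self):
                  cur = self.conn.cursor()
                  # Create
                  cur.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
                  user_id = cur.lastrowid
                  # Read
                  row = cur.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
                  self.assertEqual(row[0], "alice")
                  # Update
                  cur.execute("UPDATE users SET name = ? WHERE id = ?", ("bob", user_id))
                  row = cur.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
                  self.assertEqual(row[0], "bob")
                  # Delete
                  cur.execute("DELETE FROM users WHERE id = ?", (user_id,))
                  self.assertIsNone(cur.execute("SELECT 1 FROM users WHERE id = ?", (user_id,)).fetchone())

              def tearDown(self):
                  self.conn.close()

          if __name__ == "__main__":
              unittest.main()

      In a real assignment, the same four steps would run through the application's actual data-access layer rather than a scratch database.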


      • #4
        And here is a weekly report submitted by one of your fellow interns:

        May 5-9
        • Worked on different test matrices, each testing a different feature of the XYZ Product.
        • Tested installation of CWS, verifying the connection to the server via admin, operator, and guest logins (a sketch of such a check follows this report).
        • Tested installation of the XYZ printer delete utility, feature-tested it, and tested its uninstallation.
        • Installed CWS and the printer delete utility on all operating systems, including Vista 32-bit, Vista 64-bit, Intel Mac, Win XP, Win XP 64-bit, OS X, and Win 2000.
        • Reimaged the computer to different OSes using a reboot disk.
        • Learned how to map a network and how to install a new version of Fiery on the server: use Nero to burn the ISO images (received through e-mail) to CDs or DVDs, transfer the contents of the CDs to a USB flash drive, and then use that to install the new version of Fiery on the server. I think this is called installing the new release of Fiery on the server, which is done almost every week. After that I would run the setup on the engine.
        • Installed the new release of the Fiery user software, which comes out every week, to test CWS and the printer delete utility.
        • Learned to run a setup on the server after installing a new version of XYZ.
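        The multi-login check in the report above lends itself to scripting. Below is a minimal pytest sketch; connect() and Session are hypothetical stubs standing in for the real CWS client API, included only to keep the example self-contained:

            import pytest

            class Session:
                """Stand-in for a real CWS session object (hypothetical)."""
                def __init__(self, role):
                    self.role = role
                    self._open = True

                def is_connected(self):
                    return self._open

                def close(self):
                    self._open = False

            def connect(role, password):
                """Stand-in for the real connection call; replace with the actual client API."""
                return Session(role)

            # Each login role must be able to establish a session with the server.
            @pytest.mark.parametrize("role,password", [
                ("admin", "admin-password"),
                ("operator", "operator-password"),
                ("guest", ""),  # guest login typically needs no password
            ])
            def test_server_connection(role, password):
                session = connect(role, password)
                assert session.is_connected()
                session.close()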


        • #5
          One more report, which was a really nice surprise to me:

          Test www.xyz.com – a web-based application powering the next generation of social mini-blogging and media interaction. Koollage offers a refreshingly easy way to aggregate images, video, audio, web content, and RSS feeds, annotate them with short text summaries, and instantly publish to your world.
          ● Studied the requirements document: important functionalities, new features required for the next release, and the areas of the application the developers think carry greater risk.
          ● Checked and reviewed bugs in the existing bug tracking system.
          ● Performed regression testing on major functionalities of the application.
          ● Wrote specific test cases (steps to reproduce) for bugs that developers claimed to have fixed but that resurfaced during regression testing.
          ● Requested certain features that were not yet implemented.
          ● Created various test scenarios that were not specified in the functional spec document.
          ● Installed Safari 3.1 to test the application in this browser.
          ● Performed testing in 3 major browsers: IE 7.0, Mozilla Firefox 2.0, and Safari 3.1.
          ● Tracked the problems reported in the bug tracking system.
          ● Familiarized myself with new bug tickets reported by several other testers in the bug tracking system.
          ● Worked on different test matrices, which reinforced the relation between business rules and test cases.
          ● Wrote test summary reports on major features of the application such as Login, Registration, Pod functionalities, and Search tools.
          ● Observed how each browser misbehaved in response to some of the bugs.
          ● Worked on the same feature of the application by opening 3 browsers side by side on the same desktop and observing the result of executing the feature in each.
          ● Wrote test cases for features that were not specified in the functional spec.
          ● Close study of the product spec document allowed me to find errors in the wording of message boxes.
          ● Wrote a defect report for a misleading UI.
          ● Performed regression testing on bugs reported as fixed by the developers in the modules I was assigned: Login, Registration, My Friends, and Search Tools.
          ● Performed functional testing on a newly assigned module.
          ● Created test scenarios with specific test data.
          ● Learned some of the added features from the new product requirements document.
          ● Tracked defects on fixed bugs, and bugs were reported in a timely manner.
          ● By reviewing the content and structure of the company's bug reports, I was able to produce better bug reports that used the terms the company was familiar with.
          ● Performed some off-the-beaten-path testing, which resulted in a runtime error: for instance, an invalid day of the month like Feb 30th threw a Java exception (see the boundary-value sketch after this report).
          ● Found that a feature affecting every user of the application did not show the right result: testing the "My friends" module showed an incorrect number of friends listed.
          ● Wrote different test cases and test scenarios for testing pod popularity: number of views, pod ratings, date of pod creation, and name of the pod's owner.
          ● Tested a new feature that was implemented and ready for testing: blog-to-pod conversion.
          ● Wrote test cases and test scenarios for the features that are the highlights of this product, like media and text compatibility and tags that can be added at any time to any frame.
          ● Provided different test data to change the appearance and look of the pod by giving different colors to the title, borders, and dashboard.
          ● Learned to work with different media; understood that audio, video, images, and text in a pod are featured differently.
          ● Wrote test cases for pod manipulation such as deletion, addition, and pod navigation.
          ● Performed tests from the user's point of view that were not specifically mentioned in the requirements document and found failing conditions, such as a frame being repeated twice when added as an index frame.
          ● Performed user interface testing in different browsers; the alignment of text areas and text fields was improper and inconsistent across browsers.
          ● While all features worked without major errors, certain features behaved inconsistently: blogs from some websites could be converted to pods while others could not, and likewise the order in which frames were added was inconsistent.
          ● Had a meeting with the QA lead and other team members to discuss new features and functionalities.
          ● Wrote various test cases and test scenarios on pod sharing with users, Klubs, and groups.
          ● Tested the feature of adding a pod to the user's favorites under various conditions: social and shared pods could be added, while private and standard pods could not.
          ● Verified that comments could be added to a pod, and performed regression tests on the comment counter (increment and decrement).
          ● Studied the functionality of two new features, "Summary" and "Comment", each of which opens a text editor.
          ● Performed tests and wrote test cases for the text editor to see whether all functions specified in the requirements document perform well: bold, italics, underline, font type, font size, spell check.
          ● Wrote test cases for mouse-over icons as well.
          ● Was assigned to perform quick smoke testing on the build and regression testing (repetitive testing) on important features.
          ● Realized the importance of a test harness and sanity tests for checking whether a build functions okay.
          ● Created several test users and test pods to properly test the pod functionalities.
          ● Checked and reviewed the test cases multiple times until they were thorough; most of the major bugs were either reported or fixed.
          ● Found that account validation from the same browser is required to activate the account; validating from any other browser will not let that browser operate the account. This was a fatal error.
          ● Constantly kept track of new features via the updated requirements document and discussions with the QA lead about fixing crucial bugs, such as the permalink giving an error page when clicked from the title frame.
          ● Learned a new feature: promotion pages for converting sites to pods.
          ● Created a new test site for testing.
          ● Since the application moved to a bigger, more stable server this week, functional and regression testing was performed aggressively to make sure the modules work okay and nothing major breaks.
          ● Reviewed test plans and found test cases written earlier for old features that have since been removed or modified; certain test cases are no longer valid because they were written for modules that no longer exist.
          ● Updated the document (test plan) with the current features and terms.
          ● Replaced existing test cases with new ones because the names of the previous icons and buttons had changed.
          ● Found that a feature that performed really well in other browsers (invite a friend) did not work well in IE.
          ● Had a discussion with the QA lead; talked about and learned of many new features and the transition of the application and database to bigger servers.
          ● Familiarized myself with the igooglers and Marklet features.
          ● Found cross-browser issues in a released build.
          ● Learned about a new technology called Lazlo, used in particular to support a large number of users with quick responses; it improves performance and load handling.
          ● Performed functional testing on major modules to see whether they worked okay and ran without any interruption on the new technology.
          ● Did a complete round of testing of major modules (Community page, public and private Profile) on 2 production servers.
          ● Found that some actions interrupted major functionalities, like a broken link: "Send a friend invite to this koollager".
          ● Tracked bugs and reported issues using the bug tracking system. Performed 540 test cases.
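          The invalid-date finding above (Feb 30th throwing a Java exception) is a classic boundary-value case. Here is a minimal sketch of how such inputs could be table-tested in Python; validate_date() is a hypothetical stand-in for the application's own validation:

              import datetime
              import pytest

              def validate_date(year, month, day):
                  """Hypothetical stand-in for the application's date validation.
                  Returns True for a real calendar date, False otherwise; never raises."""
                  try:
                      datetime.date(year, month, day)
                      return True
                  except ValueError:
                      return False

              @pytest.mark.parametrize("year,month,day,expected", [
                  (2008, 2, 29, True),   # leap day in a leap year
                  (2007, 2, 29, False),  # leap day in a non-leap year
                  (2008, 2, 30, False),  # the Feb 30th case that threw the exception
                  (2008, 4, 31, False),  # April has only 30 days
                  (2008, 12, 31, True),  # upper boundary of the year
              ])
              def test_date_boundaries(year, month, day, expected):
                  # The application should reject invalid dates gracefully, not crash.
                  assert validate_date(year, month, day) == expected

          The same table can grow as new edge cases (leap years, month-length boundaries) turn up during testing.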
