
Load testing uPortal and other Jasig projects

Load Testing

Participants: Eric Dalquist, Joe Swanson, Alan Takaoka, Jay Patel, Tamra Valadez

Leonar Fuller from Unicon is a good resource - he has done load testing for Pearson (the largest uPortal installation: 100K concurrent users, over a million-user base).

UW - user base of 66-70K unique users per academic term, with a sustained load of about 3 requests per second. Spring term (1/1-5/17): 88,000 unique logins, 7M total logins.

Portal load depends on how your portal is used. Example: an email client within the portal. Madison's portal is a dashboard: a user logs in, and 70% of the time the user clicks on only one other thing - they have found what they need, or have clicked through to another site.

UW - a lot of web proxy content. Pubcookie is used as the login mechanism, so there are no proxy authentication issues. The Cyprus document server explicitly trusts the uPortal addresses, and information is provided directly. Could only load test one server because the Pubcookie system couldn't handle more - you don't want to put too much fake load on secondary systems.

Eric will post JMeter test script - to provide an example.

How did you choose JMeter? A synthetic user count never matches your real capacity. Aim for requests per minute - build a script to achieve this and add threads until you reach your desired capacity.
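
The "requests per minute, then add threads" sizing can be sketched with Little's Law (concurrency = throughput × time per iteration). A minimal sketch - the numbers below are illustrative assumptions, not figures from the session:

```java
// Estimate a JMeter thread count from a target request rate (Little's Law).
// All numbers here are illustrative assumptions, not measured values.
public class ThreadEstimate {
    // threads = throughput (req/s) * seconds per scripted iteration
    static int threadsFor(double requestsPerMinute, double secondsPerIteration) {
        double requestsPerSecond = requestsPerMinute / 60.0;
        return (int) Math.ceil(requestsPerSecond * secondsPerIteration);
    }

    public static void main(String[] args) {
        // e.g. a 180 req/min target where each scripted iteration takes
        // ~20 s (think time + response time) needs about 60 threads.
        System.out.println(threadsFor(180, 20));
    }
}
```

In practice you would start below this estimate and ramp threads up while watching the achieved request rate, since response time grows as the server saturates.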

1.  CPU based test - requests

2.  Concurrent user load - the script logs out only 10% of the time, keeping users logged in, which increases the load
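
Why the 10% logout rate inflates concurrent load: if a scripted user logs out with probability p at the end of each loop, the expected number of loops per session is 1/p (a geometric distribution). A small sketch of that arithmetic:

```java
// If a virtual user logs out with probability p after each loop, the
// expected session length is 1/p loops (geometric distribution).
public class SessionGrowth {
    static double expectedLoops(double logoutProbability) {
        return 1.0 / logoutProbability;
    }

    public static void main(String[] args) {
        // 10% logout -> each virtual user stays for ~10 loops, so the
        // concurrent-session count runs ~10x higher than a script that
        // logs out after every loop at the same arrival rate.
        System.out.println(expectedLoops(0.10));
    }
}
```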

JS - it depends on what your portal is used for. Ours has registration (which is heavily used), so for 4 weeks of the year you are at maximum capacity. Basically, our portal is a stepping stone: load is manageable year round, but additional servers are added during registration. Are you measuring the performance of the portal, or of the systems integrated with the portal? Two-stage approach: channel load scripts, a Perl HTTP web test, and HtmlUnit (a Java framework) to traverse menu trees and test portlet load. The problem with this approach is that when you add a new channel the script has to be revised. Average 40-53K unique visits, 1.8M page views.

ED - many people miss the total number of unique user logins; e.g., DB issues are uncovered only as you hit a certain number of unique users (a SQL query failed). Watch the number of rows in a DB table: the portlet entity table (user layout preferences) holds 1 row per user per portlet, so a user would have approximately 70-100 rows, depending on the number of portlets. Poorly designed SQL running against a 5-6M user base caused a failure. You need to know the total number of users that have been through the system.
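
The table-growth point can be made concrete with back-of-envelope arithmetic, using the figures quoted above (88K unique users, 70-100 rows per user):

```java
// Back-of-envelope size of the portlet entity table: 1 row per user per
// portlet, using the figures quoted in the notes (70-100 rows per user).
public class TableSize {
    static long rows(long users, long rowsPerUser) {
        return users * rowsPerUser;
    }

    public static void main(String[] args) {
        // 88,000 unique users at ~80 portlet rows each -> ~7M rows.
        // A poorly designed query scanning a table this size is exactly
        // the kind of failure described above, and it only shows up once
        // enough *unique* users have been through the system.
        System.out.println(rows(88_000, 80));
    }
}
```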

How do you get enough unique users through your load testing?  

Even if a user has more than one session open, they are still one unique user with one layout.
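
One common way to push enough unique users through a test is to generate a credentials file that JMeter's CSV Data Set Config element feeds to each thread, so every iteration logs in as a different account. A sketch - the account naming scheme is a made-up example:

```java
import java.io.PrintWriter;

// Generate a username,password CSV of unique test accounts for JMeter's
// CSV Data Set Config. The "loadtest<N>" naming scheme is a hypothetical
// example; real accounts must exist in the portal's user store.
public class UserCsv {
    static String line(int i) {
        return "loadtest" + i + ",changeme" + i;  // username,password
    }

    public static void main(String[] args) throws Exception {
        try (PrintWriter out = new PrintWriter("users.csv")) {
            for (int i = 1; i <= 10_000; i++) out.println(line(i));
        }
        System.out.println(line(1));
    }
}
```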

UChicago is testing Shib with uPortal. The biggest challenge is standing up a new system. Don't spend too much time trying to test all cases; watch your statistics. Perform load testing in the background during your phased rollout, feed the metrics back into your load testing, and scale up as needed.

Statistics - uPortal has an event model. In uPortal 3.1 you can output these events to a DB. An outside tool at UW summarizes them at 5-minute intervals. This will be an insane amount of data; moving it to a separate DB is strongly recommended. Tracks (per group): channel/portlet requests (targeted, render time, etc.), login counts per group, concurrent users (any event from a user within that 5-minute window), how many users log in N times in a day, the number of times a tab has been requested, and portlet render times.
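
The 5-minute summarization window described above boils down to flooring each event timestamp to the start of its interval, then keying aggregate counters by (bucket, group). A minimal sketch of that bucketing, not the UW tool itself:

```java
// Sketch of 5-minute event summarization: floor an event timestamp
// (epoch millis) to the start of its 5-minute window. Aggregate counters
// (logins, portlet renders, tab requests) would be keyed by this value
// plus group. Illustrative sketch, not the actual UW tool.
public class FiveMinuteBucket {
    static final long WINDOW_MS = 5 * 60 * 1000;

    static long bucket(long epochMillis) {
        return epochMillis - (epochMillis % WINDOW_MS);
    }

    public static void main(String[] args) {
        // Two events 90 seconds apart land in the same bucket, so one
        // user producing both counts once toward that window's
        // concurrent-user figure.
        long t = 1_000_000_000L;
        System.out.println(bucket(t) == bucket(t + 90_000));
    }
}
```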

UW - ED has plain old Oracle SQL queries; he will push them out to the Jasig Wiki.

Perhaps utilizing BO would be a good solution.  

Steps:  

1.  Turn uPortal on

2.  Point to external DB for storage of raw data

3.  Create the aggregate data tool - setup (2 weeks).

4.  Create reporting mechanisms

 
JMX - JConsole is a handy tool to utilize; DB connection pool statistics are available through it. Utilize sticky sessions with round-robin load balancing.
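
To attach JConsole to a remote Tomcat, the standard JMX system properties can be added to `CATALINA_OPTS`. A minimal unauthenticated example for an isolated load-test host only - the port number is illustrative:

```shell
# Illustrative only: this opens an UNAUTHENTICATED JMX port. Use solely
# on an isolated load-test host, never in production.
export CATALINA_OPTS="$CATALINA_OPTS \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9010 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
```

JConsole can then connect to `host:9010` to watch heap, threads, and the DB connection pool MBeans while the load test runs.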


