Mobile Application Testing – 01 Synergy

April 4th, 2012 by Rahul

01 Synergy offers a comprehensive range of Mobile Application testing services, from Unit Testing to User Acceptance Testing. The complexity introduced by different handset makers, carriers, locations and operating systems has made building bug-free mobile apps genuinely difficult.

Our areas of expertise include:

  • Requirements Capture and Analysis
  • Test Planning
  • Test Case Design
  • Test Execution
  • Defect Tracking & Management
  • Reporting
  • Test Metrics

01 Synergy offers a wide range of Mobile Application testing services, including:

  • Functional Testing
  • Security Testing
  • Load & Performance Testing
  • Localization Testing
  • Usability Testing

Our QA professionals can help you with all your mobile app testing projects, including:

  • iOS Application Testing (iPhone, iPad, iPod Touch)
  • Android Application Testing
  • BlackBerry Application Testing
  • Windows Phone 7 Application Testing

01 Synergy is here to help: if you need to discuss mobile application testing or agile testing, do count on us. Visit us online at www.01sqa.com or email us at mobile.testing@01synergy.com

China’s Top Supercomputer Dramatically Speeds Genomics Pipeline

June 29th, 2015 by Amrinder

It has been somewhat difficult to ascertain what problems the fastest supercomputer on the planet has been chewing on since it was announced in 2013, but there are some signs that China is now pinning the machine’s mission on the future of genomics, among other areas.

This July, the twice-yearly list of the top systems in the world will refresh, and there is little doubt that the 3,120,000-core beast will retain its spot at the top, with well over 33 petaflops of sustained performance (and approximately 54 petaflops of peak capability) delivered by 16,000 nodes, each sporting two Ivy Bridge generation processors and three Xeon Phi coprocessors. But the same questions about how the machine is being used, and more specifically how China is doing the software legwork to fully leverage such a monolith, will likely resurface during the list unveiling at ISC ’15 in Frankfurt.

It is safe to assume that plenty of what happens on Tianhe-2 stays within hushed circles, but what one team has been able to showcase promises a big breakthrough for genetic research, and demonstrates how application teams are rallying to meet the demands and opportunities of dramatic boosts in core counts, memory, storage and other resources, not to mention the need to keep such machines fully fed.

A team working on the Chinese supercomputer was able to achieve a 45x speedup on a single node of the system, without any loss in precision, by refining the parallelization of a critical part of the genomic analysis pipeline. By revamping how a commonly used SNP detection framework shares the load, via the team’s mSNP framework, they were able to take this single-node performance and scale it to just over 4,000 nodes of the Xeon Phi-boosted super.

The existing tool is called SOAPsnp, which the team says took more than a full week to analyze a single human genome with 20-fold coverage. To put the critical nature of this step in the larger genomics workflow into context, consider the role of SNP detection in the future of medicine. A single nucleotide polymorphism (SNP) is the genetic equivalent of a bit flip: a position in the DNA sequence where variation between individuals occurs. SNPs are useful to identify since they can pinpoint vulnerabilities to certain diseases, map more targeted pharmaceutical routes, and highlight other genetic markers of importance. And they are not few and far between; several million SNPs have been identified in the human genome alone. While we are more concerned with, say, sequential code than gene sequences here at The Platform, this is useful background for the main point of conversation, especially since speeding the detection of SNPs provides a significant performance, and thus efficiency, advantage for large-scale systems doing complex genomics research.
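To make the idea concrete, here is a minimal, hypothetical sketch of the core of SNP detection: pile up the bases that aligned reads place at each reference position, then flag positions where a non-reference base dominates. SOAPsnp itself uses a far more sophisticated Bayesian, quality-aware model; the function, thresholds and data below are illustrative assumptions only.

    from collections import Counter

    def call_snps(reference, aligned_reads, min_depth=3, min_alt_frac=0.75):
        # Toy SNP caller: aligned_reads holds (start, sequence) pairs
        # already aligned to the reference string. Real tools such as
        # SOAPsnp weigh base qualities in a Bayesian model; this only
        # shows the gist of the computation.
        pileup = [Counter() for _ in reference]
        for start, read in aligned_reads:
            for offset, base in enumerate(read):
                pos = start + offset
                if 0 <= pos < len(reference):
                    pileup[pos][base] += 1

        snps = []
        for pos, counts in enumerate(pileup):
            depth = sum(counts.values())
            if depth < min_depth:
                continue  # too little coverage to call anything here
            alt, alt_count = max(counts.items(), key=lambda kv: kv[1])
            if alt != reference[pos] and alt_count / depth >= min_alt_frac:
                snps.append((pos, reference[pos], alt))
        return snps

    # Every read covering position 3 sees a G where the reference has T.
    ref = "ACGTACGT"
    reads = [(0, "ACGG"), (1, "CGGA"), (2, "GGAC"), (3, "GACG")]
    print(call_snps(ref, reads))  # [(3, 'T', 'G')]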

The researchers note that they achieved this speedup via mSNP by fully harnessing the coupling of the Ivy Bridge processors with the Xeon Phi. Interestingly, this appears to be one of only a few projects to speed this part of the workflow: the researchers state that they are not yet aware of any other parallelized variant of the SNP detection process that can take advantage of the Xeon Phi (although there are GPU-accelerated frameworks targeting the same workload, some of which have also been developed in China). As the team describes in great detail, they “redesigned the key data structure of SOAPsnp, which significantly reduces the overhead of memory operations, then devised a coordinated parallel framework, in which the CPU collaborates with the Xeon Phi for higher hardware utilization.”
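The team’s actual offload code targets the Phi’s programming model and is not reproduced here, but the scheduling idea behind such a coordinated framework can be sketched in a few lines: give the host and the coprocessor shares of the work proportional to their throughput and run both sides concurrently so neither idles. Everything below, the rates and the stand-in worker functions, is an assumption for illustration.

    from concurrent.futures import ThreadPoolExecutor

    def split_by_throughput(items, cpu_rate, phi_rate):
        # Hand each device a share of the work proportional to its
        # measured throughput, so both sides finish at about the same time.
        cut = round(len(items) * cpu_rate / (cpu_rate + phi_rate))
        return items[:cut], items[cut:]

    def process_on_cpu(chunk):
        # Stand-in for the host-side portion of the pipeline.
        return [x * x for x in chunk]

    def process_on_phi(chunk):
        # Stand-in for work offloaded to the coprocessor; real code would
        # go through the Xeon Phi's offload programming model instead.
        return [x * x for x in chunk]

    work = list(range(100))
    # Assumed, illustrative rates: the Phi side is twice as fast here.
    cpu_chunk, phi_chunk = split_by_throughput(work, cpu_rate=1.0, phi_rate=2.0)

    # Run both sides concurrently so neither the CPU nor the Phi idles,
    # mirroring the goal of the coordinated framework described above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        cpu_future = pool.submit(process_on_cpu, cpu_chunk)
        phi_future = pool.submit(process_on_phi, phi_chunk)
        results = cpu_future.result() + phi_future.result()

    assert results == [x * x for x in work]

A static split like this only balances well when per-item cost is uniform; a production framework would measure and rebalance dynamically, but the principle of keeping both processors busy simultaneously is the same.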

As they will describe in more detail in July during the Top 500 supercomputing announcement week in Frankfurt, the NUDT researchers took these optimizations further and proposed “a read-based window division strategy to improve throughput and parallel scale on multiple nodes,” which, again, represents a first on the Xeon Phi.
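The precise strategy is not spelled out in the article, but the general shape of a read-based window division is easy to sketch: bucket each read into the reference window (or windows) its alignment touches, so that every window becomes a self-contained unit of work that can be shipped to a separate node. The window size and boundary policy below are illustrative assumptions.

    from collections import defaultdict

    def divide_reads_into_windows(aligned_reads, window_size):
        # Bucket each read into the window(s) its alignment touches.
        # A read straddling a boundary is duplicated into both windows so
        # no position loses coverage (one possible boundary policy).
        windows = defaultdict(list)
        for start, read in aligned_reads:
            first = start // window_size
            last = (start + len(read) - 1) // window_size
            for w in range(first, last + 1):
                windows[w].append((start, read))
        return windows

    reads = [(0, "ACGG"), (98, "CGGA"), (150, "GGAC")]
    buckets = divide_reads_into_windows(reads, window_size=100)
    # The read starting at 98 spans positions 98-101, so it lands in
    # window 0 and window 1; each window can now go to a different node.
    print(sorted(buckets))                   # [0, 1]
    print(len(buckets[0]), len(buckets[1]))  # 2 2

Because each window’s pileup depends only on the reads assigned to it, nodes need not exchange intermediate state, which is what lets throughput scale across thousands of nodes.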

As the news potentially shifts away from huge leaps in computing power for this list (assuming no influx of further Tianhe-2 nodes this Top 500 cycle), there will likely be more emphasis placed on the real application performance of the massive machine. The argument has often been made that while the system is no doubt impressive, it is incredibly complex: beyond the stunning number of compute cores, its front end is built on SPARC-variant Galaxy FT-1500 CPUs and runs a homegrown Linux variant, both of which were developed at the Chinese National University of Defense Technology (NUDT) with help from Chinese IT manufacturer and integrator Inspur.

And on the applications and usability front, one can expect the same questions about the machine’s other purposes, especially in the wake of the recent U.S. block on supercomputer exports to China. While news like the genomics work does highlight that real and valuable progress is being made on large machines (no matter where they are based), the tone of the conversation about how this behemoth will prove itself over time is likely to take a different turn, especially with the export block in place and the Chinese super looking set to dominate for at least another year. Although remember: in 2013, no one saw this big system coming, least of all with such a peak petaflop rating.

Source: http://www.theplatform.net/2015/06/29/chinas-top-supercomputer-dramatically-speeds-genome-analysis/

Some midsize firms would rather handle data internally

June 29th, 2015 by Amrinder

Midsize money managers are looking within for their data management, countering the broader trend among such firms of outsourcing middle- and back-office functions.

Regulations stemming from the Dodd-Frank Wall Street Reform and Consumer Protection Act and Basel III are requiring money managers to keep a closer watch on their data, which can cover anything from performance measurement to security categorization and compliance. Large bulge-bracket managers already have the capacity to handle these duties internally, sources said, but it’s the midtier managers, those with $10 billion to $100 billion in assets under management, that face the question of how that information should be monitored, and by whom.

“Firms are looking at what their high-value functions are, what they need to succeed,” said Marc Mallett, Naperville, Ill.-based vice president of product and manager services at SimCorp, an investment management software provider. “Having a good investment book of record, getting the best data they can for good investments, transparency for regulatory and client demands, that’s why firms are moving” toward internalizing data management.

A SimCorp survey of 88 money managers released June 23 showed 53% aren’t confident that the investment performance figures they report are accurate, and 80% said portfolio managers do not receive investment performance numbers based on intraday position calculations. But 59% are able to determine the trades, prices, foreign exchange rates and security classifications behind their portfolios’ performance numbers. The managers surveyed had a combined $22.5 trillion in assets under management.

“We’ve found that getting at and managing the data is the root cause of investment inefficiency,” said Brent Beardsley, senior partner and managing director at The Boston Consulting Group, Chicago. “It means managers have to do a lot of reconciliation activity which is costly for them. Most managers are paying more attention to this. It’s more of a capability and quality issue. (Data are) so embedded in what they do, they want to make sure it’s done right and done quickly.”

Some midtier managers have created chief data officer positions, often at a senior level. “That’s the governance piece. We’ve been seeing that role growing” in money management, said Paul McInnis, director of product strategy at data manager Eagle Investment Systems LLC, Wellesley, Mass. “Ten years ago, that position wasn’t well-defined. Now there’s a real centralized concern (at managers) on how data is managed. They realize that data is an asset, and are managing data accordingly.”
Almost all brought in

Brian Baczyk, chief data officer at Conning & Co., Hartford, Conn., said the firm handles almost all of its data management internally, outsourcing only a small portion involving the identification of legal entities related to the securities it holds.

In the past six months, Conning formalized the organization of its data management group among the three staff members who had been working on it and other tasks. All three have investment management experience, Mr. Baczyk said.

Mr. Baczyk said cost, competitive advantage over other managers and control of its own data were the motivations for Conning to handle data management internally. “We know what’s important to us,” Mr. Baczyk said. “(External) providers would have to hit that mark for us, and we think we can do that better. There’s also a bit of a competitive advantage. We have a set of unique requirements that general providers don’t address.” Conning had $95 billion in assets under management as of March 31.

As for control of data management, Mr. Baczyk said that’s driven by compliance requirements. “Regulators demand accountability from our firm directly rather than shift the responsibility to a vendor,” he said.

That accountability could be seen as extending to cybersecurity as well, said Mr. Baczyk. “Regardless of how a company manages their data — either in-house or (has) outsourced data governance functions — a company still needs to have its own people with the right responsibility and authority in place to address cyber risk.”

Others said cybersecurity hasn’t been a driver for internal data management — though it could be. “I’ve not seen cybersecurity as the primary reason for internalizing data management,” said Boston Consulting’s Mr. Beardsley. “It’s potentially a factor, but in my conversations in the market, I’ve not heard of this being mentioned as the driving force.”

Jack O’Connor, head of business development and client service at DDJ Capital Management LLC, Waltham, Mass., also said that cybersecurity concerns aren’t a motivation for internal data management, although “the issue of cybersecurity is causing us to evaluate our systems and security protocols on a regular basis to ensure we are employing the best systems and methods to protect data.”

DDJ has refined its internal data management capabilities recently, although it’s been monitoring data internally since the firm was founded 19 years ago, said Mr. O’Connor. “While outsourcing this type of oversight may be desirable to some firms, especially from a cost perspective, we believe that our firm would still need to dedicate a fair amount of time overseeing the process in an effort to ensure the most reliable data,” Mr. O’Connor said.

Mr. O’Connor said DDJ chose to internally manage its data “because of the nature of the types of investments that we make on behalf of our clients,” adding in-house data management “is in (clients’) best interest.” DDJ managed about $8.3 billion in high-yield fixed income and leveraged loans as of March 31.

DDJ does not have a chief data officer, Mr. O’Connor said; those duties are handled by Joshua McCarthy, the firm’s general counsel and chief compliance officer, who “has played an important role in the development of many of the improvements the firm has made around data integrity/management,” Mr. O’Connor said.
Adapting to insourcing trend

Outsourced data management firms have adapted to the insourcing trend, providing software for managers who decide to manage internally. However, some money managers, like Conning, have built their own proprietary data management technology and do not use software from outside providers, Mr. Baczyk said.

“Whether fully outsourced or insourced, managers still are the consumers of the data,” Mr. McInnis said. The money manager might retain analysis of the markets, reconciliation of accounts, performance attribution and the classification of securities. “It’s bespoke to what a particular client wants,” he said.

SimCorp’s Mr. Mallett said control is what drives midtier managers to handle data management in-house, even as much asset servicing continues to be outsourced. “Some asset servicing has been outsourced recently, but the trend in data is heading back in the other direction,” he said. “In a post-crisis world, at the end of the day, managers want more control of the data and need to meet the demand of transparency. Whether it be from managers’ clients or regulators, folks are demanding to know that your positions are accurate, where they are, and that risk controls are in place. When you outsource any of that, you’re giving up control.”

Source: http://www.pionline.com/article/20150629/PRINT/306299979/some-midsize-firms-would-rather-handle-data-internally

Telstra offers free porn filtering with Telstra Broadband Protect

June 29th, 2015 by Amrinder

As lobbyists push for mandatory porn filtering, Telstra has overhauled its opt-in “clean feed” and bundled it with other family-friendly features.

Mandatory porn filtering will certainly be next on the agenda now that Australia has anti-piracy laws for blocking sites like The Pirate Bay. There’s no need to wait if you simply want to block access to the seedy side of the internet from your own home.

The idea of mandatory filtering understandably doesn’t sit well with some people, as censorship is a slippery slope. Some people will argue that optional clean feeds add momentum to the push for mandatory porn filtering. Others will argue that services like Telstra Broadband Protect actually make it easier to argue against mandatory filtering, because it’s less necessary when there are free and easy to use opt-in filters for everyone who wants them.

Telstra Broadband Protect is an optional extra with Telstra’s new broadband plans, which become available on Tuesday to coincide with the launch of the nationwide Telstra Air Wi-Fi network. If you want to stick with your old Telstra broadband plan, you can add Telstra Broadband Protect for $9.95 per month. Alternatively, you can get some of the same features for free, regardless of your ISP, using services like OpenDNS Family Shield and Norton ConnectSafe.

There are three main components to Telstra Broadband Protect: parental controls with website filtering, antivirus and “social network protection”.

The website filtering is designed to automatically block websites known to host malicious or scam content, working hand-in-hand with the antivirus software, which is available for up to six computers and Android mobile devices. Along with these protections, the account holder can enable parental controls to block inappropriate content like porn.

The website filtering features protect every device in your home, which means the filtering must be performed at the network or modem level – perhaps via DNS redirection. This kind of filtering is far less likely to have an impact on internet speeds and general performance than installing clunky filtering software on every device.
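As a rough sketch of what DNS-level filtering looks like from the client side, assuming the third-party dnspython package is installed: query a domain against OpenDNS Family Shield’s public resolvers (208.67.222.123 and 208.67.220.123, the free service mentioned above) and compare the answer against an unfiltered resolver. The test domain and the block-detection heuristic are illustrative assumptions, and the article itself only speculates that Telstra uses DNS redirection.

    import dns.resolver  # third-party dnspython package (version 2.x)

    FAMILY_SHIELD = ["208.67.222.123", "208.67.220.123"]  # OpenDNS Family Shield
    UNFILTERED = ["8.8.8.8"]                              # Google Public DNS

    def resolve_with(nameservers, domain):
        # Ask the given resolvers for the domain's A records.
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = nameservers
        answer = resolver.resolve(domain, "A")
        return sorted(record.address for record in answer)

    domain = "example.com"  # substitute a domain you want to test
    filtered = resolve_with(FAMILY_SHIELD, domain)
    unfiltered = resolve_with(UNFILTERED, domain)

    # A filtering resolver typically answers for a blocked domain with the
    # address of its own block page, so a differing answer is a strong hint
    # (heuristic only: CDNs can also hand different addresses to different
    # resolvers).
    if filtered != unfiltered:
        print(domain, "appears to be redirected by the filter:", filtered)
    else:
        print(domain, "resolves identically:", filtered)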

Parents might also appreciate the “Homework Time” feature which lets you limit how much time children can spend on the internet as well as block access to entertainment sites when the children should be studying or sleeping. I’ll double-check with Telstra, but I expect this is done by the modem using the IP address of the device you want to control.

Then we come to the social network protection, which is a lot more intrusive and seems to cross the line between protecting and spying. To quote Telstra:

“The software allows parents to help manage their children’s social networking activity. Parents can choose to receive notifications when their child connects with new ‘friends’, sees or posts images. Parents can also select to view other posts on their child’s social network accounts to help make sure everyone is playing nice. They can also choose to see what videos are being looked at on YouTube, Facebook and Twitter to be on top of age-appropriate content.”

I can see how this is a useful way to keep an eye on stranger danger and cyberbullying, but I question whether spying over a child’s shoulder on everything they do is the best way to deal with these issues. The most important things you can do to keep your children safe on the internet are to take an interest in what they’re doing online, talk to them about the dangers and encourage them to come to you when there is a problem. You can’t just hand those jobs over to filtering software.

The same goes for porn filtering: you can’t simply outsource your parental responsibilities to a machine. Services like Telstra Broadband Protect are useful, but not foolproof, when it comes to stopping young children accidentally stumbling onto inappropriate content. It’s a whole different ballgame when you’re dealing with older children who are actively looking for ways to beat your web filtering efforts. Most of the tricks for beating government piracy filtering will also help teens bypass home roadblocks like Telstra Broadband Protect, so you’re unlikely to outsmart them. At the end of the day, the keys to online safety are communication and trust.

Source: http://www.smh.com.au/digital-life/computers/gadgets-on-the-go/telstra-offers-free-porn-filtering-with-telstra-broadband-protect-20150629-gi0h2t.html
