Pass4sure P8010-004 Practice Test | Latest Pass4sure Questions Answers of IBM P8010-004 - alphernet.com.au

P8010-004 | IBM Commerce Solutions Order Mgmt Technical Mastery Test v1

Updated P8010-004 Practice Test @ Killexams


P8010-004 - IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 - braindump

Vendor IBM
Exam Number P8010-004
Exam Name IBM Commerce Solutions Order Mgmt Technical Mastery Test v1
Questions 30 Q & A
Recent Update January 10, 2019
Free PDF Download P8010-004 Brain Dump
Download Complete PDF Killexams P8010-004 Complete Document


P8010-004 Dumps and Practice Software with Real Questions


High Quality P8010-004 products: our team of specialists ensures that our IBM P8010-004 exam questions are always the latest. They are all very familiar with the exams and the exam simulator center.

How do we keep IBM P8010-004 assessments updated?: we have our own special approaches to track the latest exam data on IBM P8010-004. Sometimes we contact our partners who are very familiar with the exam simulator center, sometimes our clients email us the most recent feedback, and sometimes we receive the latest feedback from our dumps market. Once we find that the IBM P8010-004 exams have changed, we update them as soon as possible.

Money-back guarantee?: if you fail this P8010-004 IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 exam and do not want to wait for the update, we will give you a complete refund. You must send your score report to us so that we can verify it. We will issue the full refund promptly during our working hours once we receive the IBM P8010-004 score report from you.

IBM P8010-004 IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 product demo?: we have both a PDF version and a software version. You can check our software page to see what it looks like.

killexams.com Huge Discount Coupons and Promo Codes are as below:
WC2017 : 60% Discount Coupon for all exams on website
PROF17 : 10% Discount Coupon for Orders more than $69
DEAL17 : 15% Discount Coupon for Orders more than $99
DECSPECIAL : 10% Special Discount Coupon for All Orders


When will I get my P8010-004 material once I pay?: Generally, after successful payment your username/password are sent to your email address within five minutes. If there is any delay on the bank's side for payment authorization, it may take a little longer.


P8010-004 dumps, P8010-004 Discount Coupon, P8010-004 Promo Code, P8010-004 vce, Free P8010-004 vce, Download Free P8010-004 dumps, Free P8010-004 brain dumps, pass4sure P8010-004, P8010-004 practice test, P8010-004 practice exam, killexams.com P8010-004, P8010-004 real questions, P8010-004 actual test, P8010-004 PDF download, Pass4sure P8010-004 Download, P8010-004 help, P8010-004 examcollection, Passleader P8010-004, exam-labs P8010-004, Justcertify P8010-004, certqueen P8010-004, P8010-004 testking


View Full Exam »

Customer Reviews about P8010-004


P8010-004 - IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 - Reviews

Our customers are always happy to give their reviews about the exams. Most of them are our long-term users. They rely on our team alone, and they gain exam confidence by using our questions and answers and exam simulator.

Only these up-to-date P8010-004 dumps and study guide are required to pass the test.

I passed my P8010-004 certification test a week ago. The Killexams Q&A and exam simulator were a pleasant purchase: they cleared up my topics in an exceptionally short time, and I was stunned to see how good they are at their services. I owe many thanks for the high-quality product that helped me prepare for and take the test. It is by far the most thorough and well-composed piece of material. Much obliged.

Just try these latest dumps and success is yours.

Few people can bring real change to the world, but the right help can change everything for one person. I wanted to make my own mark, and although I had struggled the whole way, passing my P8010-004 showed me I could do it. I may still be short of glory, but passing my A+ assessments with Killexams became my morning and night glory.

Here is a good source of the latest dumps, with accurate answers.

I have to say that Killexams is the best place I will always rely on for my future assessments too. I first used it for the P8010-004 exam and passed successfully. At the scheduled time, I took only half the allotted time to complete all the questions. I am very happy with the Q&A study resources provided for my personal preparation. I think it is by far the best material for solid preparation. Thanks, team.

You only need a weekend to prepare for the P8010-004 exam with these dumps.

Can you smell the sweet fragrance of victory? I know I can, and it is truly a beautiful smell. You can smell it too if you log on to Killexams to prepare for your P8010-004 test. I did exactly that right before my test and was very happy with the service provided to me. The facilities here are impeccable, and once you are in, you need not worry about failing at all. I did not fail; I did quite well, and so can you. Try it!

Where can I get knowledge of the P8010-004 exam?

Being a network professional, I thought appearing for the P8010-004 exam would really help me in my career. However, time constraints made preparing for the exam very hard for me. I was looking for a study guide that could make things easier. Killexams Q&A dumps worked like wonders for me, offering a methodical approach to more focused study. Surprisingly, with its help, I managed to finish the exam in only 70 minutes, which is truly stunning. Thanks to the Killexams materials.

Exactly same questions in real test, WTF!

As I am in the IT field, the P8010-004 exam was vital for me, yet time limitations made it overwhelming to prepare well. I turned to the Killexams dumps two weeks before the exam. I managed to finish all the questions well within the allotted time. The easy-to-retain answers made it much simpler to get ready. It worked like a complete reference guide, and I was flabbergasted by the result.

These P8010-004 questions and answers provide proper understanding of the subjects.

I passed the P8010-004 exam with Killexams questions and answers. Killexams is 100% reliable; most of the questions were the same as what I got on the exam. I missed a few questions only because I went blank and didn't remember the answers given in the set, but since I got the rest right, I passed with a good score. So my advice is to learn everything you get in your preparation pack from Killexams; this is all you need to pass P8010-004.

A fantastic source of the latest dumps, with accurate answers.

This is to inform you that I passed the P8010-004 exam the other day. The Killexams questions and answers and exam simulator were very useful, and I don't think I would have managed it without them, with only a week of preparation. The P8010-004 questions are real; they are exactly what I saw in the test center. Moreover, this prep corresponds with all the key topics of the P8010-004 exam, so I was truly prepared even for a few questions that were slightly different from what Killexams provided, but on the same subject matter. Still, I passed P8010-004 and am satisfied about it.

You just need a weekend to prepare P8010-004 exam with these dumps.

I passed the P8010-004 exam today and scored 100%! I never thought I could do it, but Killexams turned out to be a gem for exam preparation. I had a good feeling about it because it seemed to cover all topics, and there were lots of questions provided. Yet, I didn't expect to see all the identical questions in the real exam. A very nice surprise; I highly recommend using Killexams.

Where can I find study guide for good knowledge of P8010-004 exam?

The practice exam is excellent; I passed the P8010-004 paper with a score of 100 percent. Well worth the cost. I will be back for my next certification. First of all, let me give you a big thanks for the P8010-004 prep dumps. They were indeed helpful for preparing for the exam and also for clearing it. You won't believe that I did not get a single answer wrong! Such comprehensive exam preparatory material is an excellent way to score high in exams.

Review Complete Testimonials »

See more IBM exam dumps


Real Exam Questions and Answers of exams

We offer a huge collection of IBM exam questions and answers, study guides, practice exams, Exam Simulator.

000-M49 | 00M-665 | 000-N33 | 000-257 | C2040-409 | 000-057 | C2150-508 | C2010-597 | M2040-669 | M2065-741 | 000-113 | C2040-411 | COG-632 | M2060-729 | 000-599 | 000-025 | 000-M93 | 000-003 | 000-081 | MSC-331 | C2170-051 | 000-041 | 000-052 | 000-775 | C2150-200 | 000-G40 | 000-651 | 000-913 | M2050-242 | 000-M71 | 000-M11 | 000-534 | 00M-235 | C9560-040 | 000-484 | A2090-612 | 000-833 | 00M-233 | 000-350 | C4040-252 | 000-N27 | 000-N20 | A2150-006 | 000-385 | P2090-086 | 000-258 | 000-579 | M2140-648 | 000-580 | 00M-670 |

View Complete IBM Collection »

Latest Exams added


Latest Practice Exam Questions and Answers Added to Killexams.com

We keep our visitors and customers updated on the latest technology certifications by providing reliable and authentic exam preparation material. Our team stays busy updating the P8010-004 exam training material and reviewing real exam changes. We try our best to provide every piece of relevant information about the test so that candidates can earn good marks and leave the test center happy.

1Y0-340 | 1Z0-324 | 1Z0-344 | 1Z0-346 | 1Z0-813 | 1Z0-900 | 1Z0-935 | 1Z0-950 | 1Z0-967 | 1Z0-973 | 1Z0-987 | A2040-404 | A2040-918 | AZ-101 | AZ-102 | AZ-200 | AZ-300 | AZ-301 | FortiSandbox | HP2-H65 | HP2-H67 | HPE0-J57 | HPE6-A47 | JN0-662 | MB6-898 | ML0-320 | NS0-159 | NS0-181 | NS0-513 | PEGACPBA73V1 | 1Z0-628 | 1Z0-934 | 1Z0-974 | 1Z0-986 | 202-450 | 500-325 | 70-537 | 70-703 | 98-383 | 9A0-411 | AZ-100 | C2010-530 | C2210-422 | C5050-380 | C9550-413 | C9560-517 | CV0-002 | DES-1721 | MB2-719 | PT0-001 | CPA-REG | CPA-AUD | AACN-CMC | AAMA-CMA | ABEM-EMC | ACF-CCP | ACNP | ACSM-GEI | AEMT | AHIMA-CCS | ANCC-CVNC | ANCC-MSN | ANP-BC | APMLE | AXELOS-MSP | BCNS-CNS | BMAT | CCI | CCN | CCP | CDCA-ADEX | CDM | CFSW | CGRN | CNSC | COMLEX-USA | CPCE | CPM | CRNE | CVPM | DAT | DHORT | CBCP | DSST-HRM | DTR | ESPA-EST | FNS | FSMC | GPTS | IBCLC | IFSEA-CFM | LCAC | LCDC | MHAP | MSNCB | NAPLEX | NBCC-NCC | NBDE-I | NBDE-II | NCCT-ICS | NCCT-TSC | NCEES-FE | NCEES-PE | NCIDQ-CID | NCMA-CMA | NCPT | NE-BC | NNAAP-NA | NRA-FPM | NREMT-NRP | NREMT-PTE | NSCA-CPT | OCS | PACE | PANRE | PCCE | PCCN | PET | RDN | TEAS-N | VACC | WHNP | WPT-R | 156-215-80 | 1D0-621 | 1Y0-402 | 1Z0-545 | 1Z0-581 | 1Z0-853 | 250-430 | 2V0-761 | 700-551 | 700-901 | 7765X | A2040-910 | A2040-921 | C2010-825 | C2070-582 | C5050-384 | CDCS-001 | CFR-210 | NBSTSA-CST | E20-575 | HCE-5420 | HP2-H62 | HPE6-A42 | HQT-4210 | IAHCSMM-CRCST | LEED-GA | MB2-877 | MBLEX | NCIDQ | VCS-316 | 156-915-80 | 1Z0-414 | 1Z0-439 | 1Z0-447 | 1Z0-968 | 300-100 | 3V0-624 | 500-301 | 500-551 | 70-745 | 70-779 | 700-020 | 700-265 | 810-440 | 98-381 | 98-382 | 9A0-410 | CAS-003 | E20-585 | HCE-5710 | HPE2-K42 | HPE2-K43 | HPE2-K44 | HPE2-T34 | MB6-896 | VCS-256 | 1V0-701 | 1Z0-932 | 201-450 | 2VB-602 | 500-651 | 500-701 | 70-705 | 7391X | 7491X | BCB-Analyst | C2090-320 | C2150-609 | IIAP-CAP | CAT-340 | CCC | CPAT | CPFA | APA-CPP | CPT | CSWIP | Firefighter | FTCE | HPE0-J78 | HPE0-S52 | HPE2-E55 | HPE2-E69 | 
ITEC-Massage | JN0-210 | MB6-897 | N10-007 | PCNSE | VCS-274 | VCS-275 | VCS-413 |

View Complete List »

See more braindumps


Actual Test Questions and Answers of exams

Here are some exams that you can explore by clicking the links below. There are thousands of exams that we provide to our candidates, covering almost all areas of certification. Prepare with our Questions and Answers and you will pass for sure.

7120X | 3203 | 7230X | 7141X | 1D0-61B | 156-210 | 000-N03 | 1Z0-584 | 640-692 | 2B0-020 | 920-130 | 000-935 | 650-286 | 70-566-CSharp | A2180-188 | 220-901 | 70-566-CSharp | PET | TB0-113 | 000-M43 | 6207-1 | A2180-607 | A2040-988 | 000-558 | 1T6-510 | 70-526-CSharp | 1D0-437 | 000-778 | 70-473 | 250-252 | HP0-785 | 7230X | 000-012 | 250-505 | P2150-739 | 640-461 | 000-424 | A2040-913 | 000-191 | HP0-823 | HP0-787 | 050-701 | C9550-606 | 000-703 | BH0-004 | M2040-642 | MSC-122 | SC0-501 | HP0-262 | 050-886 |

Read more Details »

Top of the list Vendors


Industry Leading Vendors

Top-notch vendors that dominate the world market with their technology and expertise. We try to cover almost all the technology vendors and their certification areas so that our customers and visitors can find all the information about a test in one place.

BlackBerry | QAI | IEEE | Cloudera | Genesys | CIDQ | PRMIA | Fortinet | ICDL | IRS | PostgreSQL-CE | ASTQB | ICAI | NCEES | 3COM | DRI | PMI | Cisco | IAHCSMM | Real Estate | ACI | H3C | Food | Arizona-Education | NCIDQ | Hospitality | ECCouncil | Apple | PTCB | The-Open-Group | AccessData | Cognos | GRE | Enterasys | SpringSource | APTUSC | BlueCoat | SCO | Healthcare | IBM | FSMTB | ASQ | RES | APICS | ACSM | Banking | Trainers | CIPS | LSAT | Alfresco |

View Complete List »

P8010-004 Sample Questions


P8010-004 Demo and Sample

Note: Answers are below each question.
Samples are taken from full version.

Pass4sure P8010-004 dumps | Killexams.com P8010-004 real questions | [HOSTED-SITE]



Killexams.com P8010-004 Dumps and Real Questions

100% Real Questions - Exam Pass Guarantee with High Marks - Just Memorize the Answers



P8010-004 exam Dumps Source : IBM Commerce Solutions Order Mgmt Technical Mastery Test v1

Test Code : P8010-004
Test Name : IBM Commerce Solutions Order Mgmt Technical Mastery Test v1
Vendor Name : IBM
Q&A : 30 Real Questions

What do you mean by this resource of up-to-date P8010-004 exam dumps?

After I had made the decision to go for the exam, I got amazing support for my preparation from killexams.com, which gave me genuine and reliable P8010-004 prep classes. Here I also got the opportunity to test myself before feeling confident of performing well in my preparation for P8010-004, and that was a pleasant thing that left me perfectly ready for the exam, which I passed with a good score. Thanks for all of this, killexams.


Believe me or not! This resource of P8010-004 questions works.

Well, I did it, and I can't believe it. I could never have passed the P8010-004 without your help. My score was so high I was amazed at my performance. It is all because of you. Thank you very much!!!


The P8010-004 certification exam is quite demanding.

Killexams.com is the awesome website where my dreams come true. Using the Q&A material for the preparation truly added the real spark to my studies, and I ended up acquiring a quality score in the P8010-004 exam. It is quite easy to face any exam with the help of your study material. Thank you very much for everything. Keep up the first-rate work, guys.


Do you know the fastest way to pass the P8010-004 exam? I've got it.

Passed the P8010-004 exam with 99% marks. Awesome! Considering I had only 15 days of preparation time, all credit goes to the questions and answers from Killexams. Their great material made training so smooth that I could even understand the hard subjects with ease. Thanks a lot, killexams.com, for providing such a clear and effective study guide. I hope your team keeps creating more such guides for other IT certification exams.


Am I able to locate the phone number of a P8010-004 certified expert?

I'm very happy to have found killexams.com online, and even happier that I purchased the P8010-004 package just days before my exam. It gave me the high-quality preparation I needed, since I didn't have much time to spare. The P8010-004 testing engine is truly good, and everything targets the areas and questions they test during the P8010-004 exam. It may seem strange to pay for a braindump these days, when you can find almost anything for free online, but believe me, this one is well worth every penny! I'm very happy, both with the preparation process and even more so with the result. I passed P8010-004 with a very strong score.


Where can I download P8010-004 latest dumps?
I got a first-rate result with this package. Amazing quality: the questions are accurate, and I got most of them on the exam. After I passed it, I recommended killexams.com to my colleagues, and everyone passed their tests too (some of them took Cisco assessments, others did Microsoft, VMware, and many others). I have not heard a bad review of killexams.com, so this must be the best IT training you can currently find online.


The P8010-004 exam is no longer tough with these Q&As.

I passed the P8010-004 exam last week and fully trusted this dump from killexams.com for my preparation. It is a fantastic way to get certified, since somehow the questions come from the actual pool of exam questions used by the vendor. This way, almost all the questions I got on the exam looked familiar, and I knew the answers to them. This is very reliable and honest, especially given their money-back guarantee (I have a friend who somehow failed an Architect-level exam and got his money back, so this is for real).


Do you want dumps of the P8010-004 exam to pass the exam?

killexams.com works! I passed this exam last fall, and at that point over 90% of the questions were truly legitimate. They are quite likely to still be valid, as killexams.com takes care to update their materials regularly. killexams.com is a great company that has helped me more than once. I'm a regular, so I'm hoping for a discount on my next package!


It is best to prepare for the P8010-004 exam with up-to-date dumps.

I prepared for the P8010-004 exam with the help of killexams.com IBM test preparation material. It was complicated but overall very helpful in passing my P8010-004 exam.


Such easy questions in the P8010-004 exam! I was already sufficiently prepared.

I passed my P8010-004 affirmation test a week ago. The Killexams.com Q&A and exam simulator are an exceptional item to shop for: they cleared up my subject matter effectively in a very short time, and I was stunned to learn how great they are at their services. I owe many thanks for the notable product that truly aided my preparation and the test itself. It is by far the most thorough and well-composed piece of writing. Much obliged.


IBM IBM Commerce Solutions Order

IBM (IBM) Down 10.3% Since Last Earnings Report: Can It Rebound? | killexams.com Real Questions and Pass4sure dumps

A month has gone by since the last earnings report for IBM (IBM). Shares have lost about 10.3% in that time frame, underperforming the S&P 500.

Will the recent negative trend continue leading up to its next earnings release, or is IBM due for a breakout? Before we dive into how investors and analysts have reacted of late, let's take a quick look at the most recent earnings report in order to get a better handle on the important catalysts.

IBM's Q3 Results Benefit From Cost Cutting, Lower Share Count

IBM reported third-quarter 2018 non-GAAP earnings of $3.42 per share, which beat the Zacks Consensus Estimate by a couple of cents. Earnings per share (EPS) increased 4.9% from the year-ago quarter.

The year-over-year increase in EPS can be attributed to solid pre-tax margin operating leverage (28 cents contribution) and aggressive share buybacks (19 cents contribution). This was partly offset by lower revenues (seven cents negative impact) and a higher tax rate (17 cents negative impact).
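The stated EPS growth implies a year-ago base that can be back-calculated. A minimal sketch, using only the $3.42 EPS and 4.9% growth figures quoted above (the result is rounded and purely illustrative):

```python
# Back-calculate the implied year-ago EPS from the reported figures.
eps_q3_2018 = 3.42   # reported non-GAAP EPS, $ per share
yoy_growth = 0.049   # reported year-over-year EPS growth

eps_q3_2017 = eps_q3_2018 / (1 + yoy_growth)
print(f"Implied year-ago EPS: ${eps_q3_2017:.2f}")  # ≈ $3.26
```

Note that the cent-level contributions cited in the text are rounded, so they need not sum exactly to the growth implied by this figure.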

Revenues of $18.76 billion lagged the Zacks Consensus Estimate of $19.10 billion and declined 2.1% on a year-over-year basis. At constant currency (cc), revenues remained flat.

IBM noted that signings plunged 21% to $8 billion. Services backlog declined 3% from the year-ago quarter to $113 billion.

Geographic Revenue Details

Revenues from the Americas inched up 1%, driven by continued growth in Canada and Latin America and modest growth in the United States.

Europe, Middle East and Africa decreased 2% from the year-ago quarter, driven by declines in Germany and France, partially offset by growth in Spain and the UK.

Asia-Pacific revenues declined 1% on a year-over-year basis, with modest growth in Japan.

Strategic Imperatives Growth Continues

Strategic Imperatives (cloud, analytics, mobility and security) grew 7% at cc from the year-ago quarter to $9.3 billion. Security revenues surged 34%. On a trailing 12-month basis, Strategic Imperatives revenues were $39.5 billion, up 13% (11% at cc).

Cloud revenues surged 13% from the year-ago quarter to $4.6 billion. The annual run rate for cloud as-a-service revenues increased 24% at cc on a year-over-year basis to $11.4 billion.

Cloud revenues of $19 billion on a trailing 12-month basis increased 20% (18% at cc) and now account for 24% of IBM's total revenues.
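The 24% share quoted above can be sanity-checked: dividing the trailing-12-month cloud revenue by that share gives IBM's implied total trailing-12-month revenue. A quick illustrative calculation on the rounded figures from the report:

```python
# Sanity-check the cloud share: implied total trailing-12-month revenue.
cloud_ttm = 19.0     # trailing-12-month cloud revenue, $ billions (from the report)
cloud_share = 0.24   # cloud's stated share of total revenues

total_ttm = cloud_ttm / cloud_share
print(f"Implied total TTM revenue: ${total_ttm:.1f}B")  # ≈ $79.2B
```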

Cognitive Revenues Decline

Cognitive Solutions' external revenues decreased 5.7% year over year (down 5% at cc) to $4.15 billion. Segmental revenues relating to Strategic Imperatives and Cloud declined 4% and 2%, respectively. The cloud as-a-service revenue annual run rate was $2 billion.

Solutions Software comprises offerings in strategic verticals like health, domain-specific capabilities like analytics and security, and IBM's emerging technologies of AI and blockchain. The segment also includes offerings that address horizontal domains like collaboration, commerce and talent. Solutions Software revenues decreased 3% year over year in the quarter.

IBM noted that in the commerce area, the infusion of AI into offerings like customer experience analytics helped SaaS signings grow double digits in the quarter. The recent launch of Notes Domino version 10, which is optimized for mobile and supports JavaScript and Node.js, will boost collaboration growth in 2019.

Transaction Processing Software includes software that runs mission-critical workloads, leveraging IBM's hardware platforms. Revenues fell 8% on a year-over-year basis.

IBM witnessed growth in industry verticals like health, and in key areas of analytics and security, in the quarter. Watson Health saw broad-based growth in the Payer, Provider, Imaging and Life Sciences domains.

During the quarter, the Sugar.IQ application, developed by Medtronic in partnership with IBM, hit the market. The application is designed to simplify and improve daily diabetes management.

IBM noted that analytics performed well in the quarter, driven by data science offerings and the IBM Cloud Private for Data offering.

During the quarter, the company introduced bias detection capabilities and launched new Watson services on the IBM Cloud Private platform.

Security growth was driven by offerings in orchestration, data protection and endpoint management.

In blockchain, the IBM Food Trust network for food safety went live in the quarter. Retailer Carrefour joined IBM's blockchain network. The company also jointly introduced TradeLens with Maersk, which addresses inefficiencies in the global supply chain. IBM currently supports 75 active blockchain networks.

Global Business Services Revenues Increase

Revenues from the Global Business Services-external segment were $4.13 billion, up 0.9% from the year-ago quarter (up 3% at cc). Segmental revenues relating to Strategic Imperatives grew 9%. Cloud practice revenues surged 18%. The cloud as-a-service revenue annual run rate was $1.9 billion.

Application Management revenues declined 1% from the year-ago quarter. However, Global Process Services revenues climbed 2%. Moreover, Consulting revenues increased 7% year over year, driven by strong performance from IBM's digital business.

Technology Services & Cloud Platforms: Revenues Dip

Revenues from Technology Services & Cloud Platforms-external decreased 2% from the year-ago quarter (flat at cc) to $8.29 billion. Segmental revenues relating to Strategic Imperatives advanced 16%, driven by hybrid cloud services. Cloud surged 22% from the year-ago quarter. The cloud as-a-service revenue annual run rate was $7.5 billion.

Integration Software increased 1% from the year-ago quarter. During the quarter, 95 organizations worldwide selected the IBM Cloud Private offering. Infrastructure Services revenues also improved 1% on a year-over-year basis.

However, Technical Support Services revenues decreased 3% from the year-ago quarter.

Power & z14 Drive Systems Revenues

Systems revenues increased 0.9% on a year-over-year basis (up 2% at cc) to $1.74 billion. Segmental revenues relating to Strategic Imperatives surged 5%, while Cloud revenues declined 8%.

IBM Z revenues increased 6% year over year on more than 20% MIPS growth, driven by broad-based adoption of the z14 mainframe.

Power revenues increased 17% from the year-ago quarter. During the quarter, IBM launched its next-generation POWER9 processors for midrange and high-end systems, which are designed for handling advanced analytics, cloud environments and data-intensive workloads in AI, HANA and UNIX markets.

IBM also introduced new offerings optimizing both hardware and software for AI. Management believes that products like PowerAI Vision and PowerAI Enterprise will help drive new client adoption.

However, storage hardware revenues declined 6% due to weak performance in the midrange and high end, partially offset by strong growth in All Flash Arrays. IBM noted that pricing pressure in the immensely competitive storage market is hurting revenues. The company announced its new FlashSystems with next-generation NVMe technology during the quarter.

Operating Systems Software revenues declined 4%, while Systems Hardware advanced 4% from the year-ago quarter.

Finally, Global Financing (includes financing and used equipment sales) revenues decreased 9.1% at cc to $388 million.

Operating Details

Non-GAAP gross margin remained unchanged from the year-ago quarter at 47.4%. This was IBM's best gross margin performance in years and was primarily driven by 160 basis points (bps) of expansion in services margin. However, an unfavorable mix in z14 mainframe and software fully offset this expansion.

Operating expense declined 4% year over year, owing to the realization of acquisition synergies and improving operational efficiencies. IBM continues to invest in fast-growing fields like hybrid cloud, artificial intelligence (AI), security and blockchain.

Pre-tax margin from continuing operations expanded 50 bps on a year-over-year basis to 19.2%.

Cognitive Solutions and Global Business Services segment pre-tax margins expanded 190 bps and 320 bps, respectively, on a year-over-year basis. However, Technology Services & Cloud Platforms segment pre-tax margin contracted 100 bps.

Systems pre-tax income was $209 million, down 38% year over year. Global Financing segment pre-tax income jumped 26.7% to $308 million.

Balance Sheet & Cash Flow Details

IBM ended third-quarter 2018 with $14.70 billion in total cash and marketable securities, compared with $11.93 billion at the end of second-quarter 2018. Total debt (including global financing) was $46.9 billion, up $1.4 billion from the previous quarter.

IBM reported cash flow from operations (excluding Global Financing receivables) of $3.1 billion and generated free cash flow of $2.2 billion in the quarter.

In the reported quarter, the company returned $2.1 billion to shareholders through dividends and share repurchases. At the end of the quarter, the company had $1.4 billion remaining under its current buyback authorization.

Guidance

IBM reiterated its EPS forecast for 2018. Non-GAAP EPS is anticipated to be at least $13.80.

IBM still anticipates 2018 free cash flow of $12 billion.


How Have Estimates Been Moving Since Then?

In the past month, investors have witnessed a downward trend in fresh estimates.

VGM Scores

At this time, IBM has an average Growth Score of C, though it is lagging a bit on the Momentum Score front with a D. However, the stock was allotted a grade of A on the value side, putting it in the top quintile for this investment strategy.

Overall, the stock has an aggregate VGM Score of B. If you aren't focused on one strategy, this score is the one you should be interested in.

Outlook

Estimates have been broadly trending downward for the stock, and the magnitude of these revisions indicates a downward shift. Notably, IBM has a Zacks Rank #3 (Hold). We expect an in-line return from the stock in the next few months.

Want the latest recommendations from Zacks Investment Research? Today, you can download 7 Best Stocks for the Next 30 Days. Click to get this free report. International Business Machines Corporation (IBM): Free Stock Analysis Report. To read this article on Zacks.com click here. Zacks Investment Research


IBM predicts AI will create a new breed of marketers | killexams.com Real Questions and Pass4sure dumps

as the calendar flips, marketers will seem to be to find new how to profit an facet. As IBM predicts, a new breed of entrepreneurs is rising with the support of synthetic intelligence.

IBM Watson advertising released its 2019 advertising and marketing tendencies file, highlighting traits within the business. The group predicted that in the emotion economic climate, buyers will likely have interaction extra with manufacturers which are genuine and deliver on their convictions.

That may not be a new development by itself, but IBM believes AI and machine learning will make hyper-personalization a reality, as the proliferation of data and the streamlining of marketing stacks will allow marketers to deliver customized content at a massive scale.

Michael Trapani, marketing program director for IBM Watson Marketing, said emotion and personal connection don't have to be in conflict with the seemingly cold and calculating world of AI.

"Making a connection with a brand will always be a very human, emotion-driven process for consumers. Where AI and machine learning come in is the ability to better inform marketers by uncovering insights about your customers that a human could not see or find. These insights then enable human marketers to develop better and more relevant creative and then deliver it at scale across channels to individual consumers," said Trapani.

According to the report, the expected influx of AI will lead to the emergence of 'consulgencies,' as the need to build out expertise in customer experience analytics and mobile apps will see the capabilities of consultancies and agencies converge.

"Many of the agency partners that work with IBM have added or expanded their technical and data integration and consulting capabilities. Many are also moving to more of a consultative arrangement, focused more on hours and outcomes than on media buys. As for AI, all agencies regardless of size are exploring uses of AI to solve their clients' marketing and customer challenges, whether it is building chatbots or interactive experiences," Trapani said.

"Agencies are also increasingly using off-the-shelf AI-based marketing solutions that can predict optimal customer journeys, identify customers likely to churn, and identify and predict where customers are struggling to complete goals in an online experience."

Among the predictions are the growth of the director of marketing data role and the emergence of the 'martecheter,' a more tech-savvy marketer. The report also says that, traditionally, the greatest assets to a marketer have been budget, tools, and talent, but that order will flip as the industry moves away from hiring single-skill marketers, given the emphasis on customer experience and marketing technologies.

The inner workings of the entire C-suite might see a transformation, too. According to the report, the focus on customer centricity will create more opportunities for commerce and digital teams to aggregate and experiment with customer data.


Metro Shoes Taps IBM Watson for Digital Commerce


Metro Shoes Ltd, one of India's leading multi-brand footwear chains, is launching a new Digital Commerce platform powered by Watson Customer Engagement hosted on IBM Cloud. This will include IBM Watson Order Management and Commerce for seamless digital engagement. Working with IBM Business Partner CEBS Worldwide, IBM solutions will not only help drive superior customer experiences and new levels of convenience but bring efficiencies to the supply chain.

With a national footprint of 350 physical showrooms, an expanding brand portfolio, and changing customer preferences, Metro Shoes Ltd was facing challenges in managing orders coming from multiple online platforms, which were earlier handled by unreliable software, resulting in a lack of visibility into real-time data on sales, stock location, and returns. Apart from its inventory management challenges, Metro Shoes Ltd needed to increase the online presence of some of its popular in-house brands, which were getting low visibility, impacting overall sales.

“Technology is redefining customer engagement and will be the key differentiator for retail brands of the future. We’re excited to collaborate with IBM and CEBS to embark on our digital transformation journey,” said Alisha Malik, vice president, Digital, Metro Shoes. “With IBM’s expertise in the omni-channel commerce and retail space, we are confident that these changes will not only help accelerate the execution of our strategy, but also give us an edge over competitors. At Metro Shoes, we strongly believe that the new solution will enhance the overall user experience, thereby increasing revisits, traffic and loyalty.”

With IBM, Metro Shoes Ltd can gain new levels of customer insight, which can be used to personalize the online experience for each visitor as they navigate through the website. Delivered through a single platform, Metro Shoes will be able to showcase all of its brands and recommend specific items based on insights shared by customers. This personalized experience will include new and convenient fulfillment options such as buy online, pick up in store, reserve in store, and easy returns. With these new capabilities, Metro Shoes will be able to enhance each visitor’s experience on the website by equipping commerce practitioners with cognitive tools that help them deliver omni-channel experiences that engage customers and drive sales.

With IBM’s technology capabilities and CEBS’s expertise in marketplace integration, Metro Shoes as a brand/seller will also be able to integrate with more than 14 e-marketplaces like Amazon, Flipkart and other leading portals, with a centralized system and inventory engine that allows Metro to scale up to the needs of a growing marketplace business. Further, IBM Cloud will help raise the capacity to handle heavy workloads and thereby deliver the performance required for peak usage during the shopping season.

Speaking about the collaboration, Nishant Kalra, business unit leader, IBM Watson Customer Engagement - India/South Asia, added: “IBM is at the forefront of helping clients embrace newer ways to work and digitally transforming the way they engage with their end customers. We are glad to be a part of Metro Shoes’ digital transformation journey by delivering a superior digital commerce experience, leveraging the stores by merging them with online, and ultimately driving brand advocacy. IBM in association with CEBS will enable deep innovation, faster go-to-market and streamlined processes for scalability.”

The IBM platform will create a bridge between its online and offline business, which the retailer previously lacked. With the new integrated single view, Metro Shoes will eventually be able to use insights gained from the digital realm to design specific offerings for customers as they walk into any of its stores. As a result, they can understand what customers want, ensure availability when and where they want it, and even explore cross-selling and upselling across their various brands.

For Metro Shoes, IBM Watson Order Management and Commerce solutions can pave the way for IBM’s cognitive technologies to deliver insights that help them provide customers with personalized offers and an enhanced user experience, from click to delivery.

“With over 15 years of experience in developing e-business tools, CEBS has been a trusted solutions provider and partner for companies across the globe,” said Satish Swaroop, President, CEBS Worldwide. “Our practical and versatile software solutions paired with IBM’s deep technology expertise will give Metro Shoes a real-time, centralized system for customer management.”




Killexams.com P8010-004 Dumps and Real Questions

100% Real Questions - Exam Pass Guarantee with High Marks - Just Memorize the Answers



P8010-004 exam Dumps Source : IBM Commerce Solutions Order Mgmt Technical Mastery Test v1

Test Code : P8010-004
Test Name : IBM Commerce Solutions Order Mgmt Technical Mastery Test v1
Vendor Name : IBM
Q&A : 30 Real Questions

What do you mean by these P8010-004 exam dumps?
When I had made the decision to take the exam, I got great support for my preparation from killexams.com, which gave me realistic and reliable P8010-004 prep classes. Here, I also got the chance to test myself before feeling confident of performing well on the exam. Preparing for P8010-004 this way was a pleasant experience that left me well equipped for the exam, which I scored well on. Thanks for all of this, killexams.


Believe me or not! This resource of P8010-004 questions works.
Well, I did it and I can't believe it. I could never have passed the P8010-004 without your help. My score was so high I was amazed at my own performance. It's simply because of you. Thank you very much!!!


The P8010-004 certification exam is quite stressful.
killexams.com is the awesome website where my dreams came true. Using the Q&A material for preparation really brought the spark to my studies, and I ended up obtaining a quality score in the P8010-004 exam. It is pretty easy to face any exam with the help of your study material. Thank you very much for everything. Keep up the great work, guys.


Do you know the fastest way to pass the P8010-004 exam? I've got it.
Passed the P8010-004 exam with 99% marks. Awesome! And that with only 15 days of preparation time. All credit goes to the Q&A by killexams. Its great material made training so smooth that I could even understand the hard topics with ease. Thanks a lot, killexams.com, for providing us such an easy and effective study guide. I hope your team keeps on creating more such guides for other IT certification exams.


Am I able to find the contact of a P8010-004 certified person?
I'm very happy to have found killexams.com online, and even more happy that I purchased the P8010-004 package just days before my exam. It gave the high-quality preparation I needed, since I didn't have much time to spare. The P8010-004 testing engine is really good, and everything targets the areas and questions they test during the P8010-004 exam. It may seem strange to pay for a braindump these days, when you can find almost anything for free online, but believe me, this one is well worth every penny! I'm very happy, both with the preparation process and even more so with the result. I passed P8010-004 with a very strong score.


Where can I download the latest P8010-004 dumps?
I got an excellent end result with this package. Great quality, the questions are accurate, and I got most of them on the exam. After I passed it, I recommended killexams.com to my colleagues, and everybody passed their tests, too (some of them took Cisco exams, others did Microsoft, VMware, and so on). I have not heard a bad review of killexams.com, so this must be the best IT training you can currently find online.


The P8010-004 exam is no harder with these Q&As.
I passed the P8010-004 exam last week and fully trusted this dump from killexams.com for my preparation. This is a fantastic way to get certified, as somehow the questions come from the actual pool of exam questions used by the vendor. This way, almost all questions I got on the exam looked familiar, and I knew the answers to them. This is very reliable and trustworthy, especially given their money-back guarantee (I have a friend who somehow failed an Architect level exam and got his money back, so this is for real).


Do you want dumps of the P8010-004 exam to pass it?
killexams.com works! I passed this exam last fall, and at that point over 90% of the questions were actually valid. They are very likely to still be valid, as killexams.com takes care to update their materials regularly. killexams.com is a great company which has helped me more than once. I'm a regular, so I'm hoping for a discount on my next package!


It is great to prepare for the P8010-004 exam with up-to-date dumps.
I prepared for the P8010-004 exam with the help of killexams.com IBM test preparation material. It was complicated but overall very helpful in passing my P8010-004 exam.


Such easy questions in the P8010-004 exam! I was already sufficiently prepared.
I passed my P8010-004 certification test a week ago. The killexams.com Q&A and Exam Simulator are excellent items to buy; they cleared up my subject matter effectively in very little time, and I was stunned to see how great they are at their services. I owe many thanks for the excellent product that aided my preparation and the taking of the test. It is by far the most thorough and well-composed piece of work. Much obliged.


While it is a hard task to pick reliable certification questions/answers resources with respect to review, reputation and validity, individuals often get scammed by choosing the wrong provider. killexams.com ensures that it serves its customers best with respect to exam dump updates and validity. Most of our competitors' sham-report complainants come to us for the brain dumps and pass their exams cheerfully and effortlessly. We never compromise on our review, reputation and quality, because killexams review, killexams reputation and killexams customer confidence are important to us. If you see any false report posted by our rivals under names like killexams sham report, killexams.com sham report, killexams.com scam, killexams.com complaint or anything like this, just remember there are always bad people damaging the reputation of good services for their own advantage. There are thousands of satisfied clients that pass their exams using killexams.com brain dumps, killexams PDF questions, killexams practice questions and the killexams exam simulator. Visit killexams.com, see our sample questions and sample brain dumps, try our exam simulator, and you will realize that killexams.com is the best brain dumps site.






P8010-004 Real Exam Questions by killexams.com
The killexams.com P8010-004 Exam PDF comprises a complete pool of questions and answers and dumps, checked and confirmed, along with references and explanations (where applicable). Our objective in assembling the questions and answers is not just to help you pass the exam at the first attempt, but to really improve your knowledge of the P8010-004 exam subjects.

At killexams.com, we provide fully tested IBM P8010-004 actual questions and answers, which you will need to pass the P8010-004 exam. We truly guide people to prepare for and remember the Q&A, guaranteed. It is a good call to hurry up your position as a professional within the business. Click http://killexams.com/pass4sure/exam-detail/P8010-004 We are excited with our reputation of supporting people to pass the P8010-004 exam in their first attempt. Our success rates in the preceding two years have been excellent, thanks to our happy customers, who are now able to propel their careers in the fast lane. killexams.com is the principal choice among IT specialists, notably those hoping to climb the hierarchy faster in their respective organizations. killexams.com discount coupons and promo codes are as below: WC2017: 60% discount coupon for all tests on the website; PROF17: 10% discount coupon for orders over $69; DEAL17: 15% discount coupon for orders of more than $99; SEPSPECIAL: 10% special discount coupon for all orders. You will get the most updated IBM P8010-004 braindumps with the correct answers, prepared by killexams.com professionals, allowing candidates to grasp knowledge about their P8010-004 exam course to the maximum; you will not find P8010-004 products of such quality anywhere in the market. Our IBM P8010-004 brain dumps are aimed at candidates performing 100% in their test. Our IBM P8010-004 exam dumps are the latest in the market, giving you an opportunity to prepare for your P8010-004 exam in the right way.

The best way to achieve success in the IBM P8010-004 exam is to procure reliable braindumps. We guarantee that killexams.com is the most direct pathway toward passing the IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 exam. You will be triumphant with full surety. You can see free questions at killexams.com before you buy the P8010-004 exam products. Our simulated tests follow the same multiple-choice pattern as the real exam. The questions and answers are collected by certified professionals. They give you the experience of taking the real exam. 100% guarantee to pass the P8010-004 real test.

The killexams.com IBM certification study guides are prepared by IT specialists. Lots of students have been complaining that there are too many questions in so many practice exams and study guides, and that they are simply too tired to afford any more. killexams.com experts work out this comprehensive version while still guaranteeing that all the knowledge is covered, after deep research and analysis of the P8010-004 exam. Everything is meant to make things convenient for candidates on their road to P8010-004 certification.

We have tested and approved P8010-004 exams. killexams.com provides the correct and latest IT exam materials, which practically contain all knowledge points. With the aid of our P8010-004 brain dumps, you don't need to waste your time reading reference books; you just need to spend 10-20 hours to master our P8010-004 real questions and answers. We also provide you with PDF Version and Software Version exam questions and answers. The Software Version materials are offered to give you the same experience as the IBM P8010-004 exam in a real environment.

We provide free updates. Within the validity period, if the P8010-004 brain dumps that you have purchased are updated, we will notify you by email to download the latest version of the Q&A. If you don't pass your IBM Commerce Solutions Order Mgmt Technical Mastery Test v1 exam, we will give you a full refund. You need to send the scanned copy of your P8010-004 exam report card to us. After confirming it, we will quickly give you a FULL REFUND.

If you prepare for the IBM P8010-004 exam using our testing software, it is easy to succeed in all certifications at the first attempt. You don't need to deal with all dumps or any free torrent/rapidshare stuff. We offer a free demo of each IT certification dump. You can check out the interface, question quality and usability of our practice exams before you buy.

killexams.com Huge Discount Coupons and Promo Codes are as under:
WC2017: 60% Discount Coupon for all exams on website
PROF17: 10% Discount Coupon for Orders greater than $69
DEAL17: 15% Discount Coupon for Orders greater than $99
DECSPECIAL: 10% Special Discount Coupon for All Orders










Modeled larval connectivity of a multi-species reef fish and invertebrate assemblage off the coast of Moloka‘i, Hawai‘i

Introduction

Knowledge of population connectivity is necessary for effective management in marine environments (Mitarai, Siegel & Winters, 2008; Botsford et al., 2009; Toonen et al., 2011). For many species of marine invertebrate and reef fish, dispersal is mostly limited to the pelagic larval life stage. Therefore, an understanding of larval dispersal patterns is critical for studying population dynamics, connectivity, and conservation in the marine environment (Jones, Srinivasan & Almany, 2007; Lipcius et al., 2008; Gaines et al., 2010; Toonen et al., 2011). Many coastal and reef species have a bi-phasic life history in which adults display limited geographic range and high site fidelity, while larvae are pelagic and highly mobile (Thorson, 1950; Scheltema, 1971; Strathmann, 1993; Marshall et al., 2012). This life history strategy is not only common to sessile invertebrates such as corals or limpets; many reef fish species have been shown to have a home range of <1 km as adults (Meyer et al., 2000; Meyer, Papastamatiou & Clark, 2010). Depending on species, the mobile planktonic stage can last from hours to months and has the potential to transport larvae up to hundreds of kilometers away from a site of origin (Scheltema, 1971; Richmond, 1987; Shanks, 2009). Knowledge of larval dispersal patterns can be used to inform effective management, such as marine spatial management strategies that sustain source populations of breeding individuals capable of dispersing offspring to other areas.

Both biological and physical factors impact larval dispersal, although the relative importance of these factors is likely variable among species and sites and remains debated (Levin, 2006; Paris, Chérubin & Cowen, 2007; Cowen & Sponaugle, 2009; White et al., 2010). In situ data on pelagic larvae are sparse; marine organisms at this life stage are difficult to capture and identify, and are typically found in low densities across large areas of the open ocean (Clarke, 1991; Wren & Kobayashi, 2016). A variety of genetic and chemistry techniques have therefore been developed to estimate larval connectivity (Gillanders, 2005; Leis, Siebeck & Dixson, 2011; Toonen et al., 2011; Johnson et al., 2018). Computer models informed by field and laboratory data have also become a valuable tool for estimating larval dispersal and population connectivity (Paris, Chérubin & Cowen, 2007; Botsford et al., 2009; Sponaugle et al., 2012; Kough, Paris & Butler IV, 2013; Wood et al., 2014). Individual-based models, or IBMs, can incorporate both biological and physical factors known to influence larval movement. Pelagic larval duration (PLD), for example, is the amount of time a larva spends in the water column before settlement and can vary widely among or even within species ( Toonen & Pawlik, 2001). PLD affects how far an individual can be successfully transported by ocean currents, and so is expected to directly affect connectivity patterns (Siegel et al., 2003; Shanks, 2009; Dawson et al., 2014). In addition to PLD, adult reproductive strategy and timing (Carson et al., 2010; Portnoy et al., 2013), fecundity (Castorani et al., 2017), larval mortality (Vikebøet al., 2007), and larval developmental, morphological, and behavioral characteristics (Paris, Chérubin & Cowen, 2007) may all play a role in shaping connectivity patterns. Physical factors such as temperature, bathymetry, and current direction can also substantially influence connectivity (Cowen & Sponaugle, 2009). 
In this study, we incorporated both biotic and abiotic components in an IBM coupled with an oceanographic model to predict fine-scale patterns of larval exchange around the island of Moloka‘i in the Hawaiian archipelago.
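To see why PLD sets the spatial scale of connectivity, a back-of-envelope calculation helps: passive transport distance is roughly current speed times time spent in the plankton. The sketch below uses an illustrative 0.1 m/s mean current (an assumption, not a value from this study) and PLDs spanning the short, medium, and long disperser categories of Table 1.

```python
# Back-of-envelope passive dispersal scale: distance ~ current speed x PLD.
# The 0.1 m/s mean current speed is an illustrative assumption, not a value
# from the study; the PLDs span the disperser categories in Table 1.

def passive_dispersal_km(current_speed_ms, pld_days):
    """Upper-bound straight-line transport distance in kilometres."""
    seconds = pld_days * 24 * 3600
    return current_speed_ms * seconds / 1000.0

for label, pld in [("short (5 d)", 5), ("medium (30 d)", 30), ("long (270 d)", 270)]:
    print(f"{label}: ~{passive_dispersal_km(0.1, pld):.0f} km")
```

Even this crude estimate shows why short dispersers can only bridge sites within an island while long dispersers can, in principle, span the archipelago; real trajectories are of course shortened by eddies, retention, behavior, and mortality.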

The main Hawaiian Islands are located in the middle of the North Pacific Subtropical Gyre, and are bordered by the North Hawaiian Ridge current along the northern coasts of the islands and the Hawaii Lee Current along the southern coasts, both of which run east to west and are driven by the prevailing easterly trade winds (Lumpkin, 1998; Friedlander et al., 2005). The Hawai‘i Lee Countercurrent, which runs along the southern perimeter of the chain, flows west to east (Lumpkin, 1998). The pattern of mesoscale eddies around the islands is complex and varies seasonally (Friedlander et al., 2005; Vaz et al., 2013).

Hawaiian marine communities face unprecedented pressures, including coastal development, overexploitation, disease, and increasing temperature and acidification due to climate change (Smith, 1993; Lowe, 1995; Coles & Brown, 2003; Friedlander et al., 2003; Friedlander et al., 2005; Aeby, 2006). Declines in Hawaiian marine resources argue for implementation of a more holistic approach than traditional single-species maximum sustainable yield techniques, which have proven ineffective (Goodyear, 1996; Hilborn, 2011). There is a general movement toward the use of ecosystem-based management, which requires knowledge of ecosystem structure and connectivity patterns to establish and manage marine spatial planning areas (Slocombe, 1993; Browman et al., 2004; Pikitch et al., 2004; Arkema, Abramson & Dewsbury, 2006). Kalaupapa National Historical Park is a federal marine protected area (MPA) located on the north shore of Moloka‘i, an island in the Maui Nui complex of the Hawaiian archipelago, that includes submerged lands and waters up to 1/4 mile offshore (NOAA, 2009). At least five IUCN red-listed coral species have been identified within this area (Kenyon, Maragos & Fenner, 2011), and in 2010 the Park showed the greatest fish biomass and species diversity out of four Hawaiian National Parks surveyed (Beets, Brown & Friedlander, 2010). One of the major benefits expected of MPAs is that the protected waters within the area provide a source of larval spillover to other sites on the island, seeding these areas for commercial, recreational, and subsistence fishing (McClanahan & Mangi, 2000; Halpern & Warner, 2003; Lester et al., 2009).

In this study, we used a Lagrangian particle-tracking IBM (Wong-Ala et al., 2018) to simulate larval dispersal around Moloka‘i and to estimate the larval exchange among sites at the scale of an individual island. We have parameterized our model with biological data for eleven species covering a breadth of Hawaiian reef species life histories (e.g., habitat preferences, larval behaviors, and pelagic larval durations, Table 1), and of interest to both the local community and resource managers. Our goals were to examine patterns of species-specific connectivity, characterize the location and relative magnitude of connections around Moloka‘i, describe sites of potential management relevance, and address the question of whether Kalaupapa National Historical Park provides larval spillover for adjacent sites on Moloka‘i, or connections to the adjacent islands of Hawai‘i, Maui, O‘ahu, Lana‘i, and Kaho‘olawe.

Table 1:

Target taxa selected for the study, based on cultural, ecological, and/or economic importance.

PLD = pelagic larval duration. Short dispersers (3–25 day minimum PLD) in white, medium dispersers (30–50 day minimum PLD) in light gray, and long dispersers (140–270 day minimum PLD) in dark gray. Spawn season and timing from traditional ecological knowledge shared by cultural practitioners on the island. Asterisk indicates that congener-level data was used. Superscript numbers (here rendered with ^) are reference citations.

Common name | Scientific name | Spawn type | # of larvae spawned | Spawning day of year | Spawning hour of day | Spawning moon phase | Larval depth (m) | PLD (days) | Habitat
‘Opihi/Limpet | Cellana spp. | Broadcast^1 | 861,300 | 1–60 & 121–181 | – | New | 0–5 | 3–18^1,2 | Intertidal^1
Ko‘a/Cauliflower coral | Pocillopora meandrina | Broadcast^3 | 1,671,840 | 91–151 | 07:15–08:00 | Full | 0–5^4 | 5–90*^5 | Reef
He‘e/Octopus | Octopus cyanea | Benthic^6 | 1,392,096 | 1–360 | – | – | 50–100 | 21^6 | Reef, rubble^7
Moi/Pacific threadfin | Polydactylus sexfilis | Broadcast | 1,004,640 | 152–243 | – | – | 50–100^8 | 25^9 | Sand^10
Uhu uliuli/Spectacled parrotfish | Chlorurus perspicillatus | Broadcast | 1,404,792 | 152–212 | – | – | 0–120*^11 | 30*^12 | Reef^10
Uhu palukaluka/Redlip parrotfish | Scarus rubroviolaceus | Broadcast | 1,404,792 | 152–212 | – | – | 0–120*^11 | 30*^12 | Rock, reef^10
Kumu/Whitesaddle goatfish | Parupeneus porphyreus | Broadcast | 1,071,252 | 32–90 | – | – | 0–50*^11 | 41–56*^12 | Sand, rock, reef^10
Kole/Spotted surgeonfish | Ctenochaetus strigosus | Broadcast | 1,177,200 | 60–120 | – | – | 50–100^11 | 50*^12 | Rock, reef, rubble^10
‘Ōmilu/Bluefin trevally | Caranx melampygus | Broadcast | 1,310,616 | 121–243 | – | – | 0–80*^11 | 140*^13,14 | Sand, reef^10
Ulua/Giant trevally | Caranx ignobilis | Broadcast | 1,151,040 | 152–243 | – | Full | 0–80*^11 | 140^13,14 | Sand, rock, reef^10
Ula/Spiny lobster | Panulirus spp. | Benthic^15 | 1,573,248 | 152–243 | – | – | 50–100^16 | 270^17 | Rock, pavement^16

Methods

Circulation model

We selected the hydrodynamic model MITgcm, which is designed for the study of dynamical processes in the ocean on a horizontal scale. This model solves incompressible Navier–Stokes equations to describe the motion of viscous fluid on a sphere, discretized using a finite-volume technique (Marshall et al., 1997). The one-km resolution MITgcm domain for this study extends from 198.2°E to 206°E and from 17°N to 22.2°N, an area that includes the islands of Moloka‘i, Maui, Lana‘i, Kaho‘olawe, O‘ahu, and Hawai‘i. While Ni‘ihau and southern Kaua‘i also fall within the domain, we discarded connectivity to these islands because they lie within the 0.5° boundary zone of the current model. Boundary conditions are enforced over 20 grid points on all sides of the model domain. Vertically, the model is divided into 50 layers that increase in thickness with depth, from 5 m at the surface (0.0–5.0 m) to 510 m at the base (4,470–4,980 m). Model variables were initialized using the output of a Hybrid Coordinate Ocean Model (HYCOM) at a horizontal resolution of 0.04° (∼4 km) configured for the main Hawaiian Islands, using the General Bathymetric Chart of the Oceans database (GEBCO, 1/60°) (Jia et al., 2011).

The simulation runs from March 31st, 2011 to July 30th, 2013 with a temporal resolution of 24 h and shows seasonal eddies as well as persistent mesoscale features (Fig. S1). We do not include tides in the model due to temporal resolution. Our model period represents a neutral ocean state; no El Niño or La Niña events occurred during this time period. To ground-truth the circulation model, we compared surface current output to real-time trajectories of surface drifters from the GDP Drifter Data Assembly Center (Fig. S2) (Elipot et al., 2016), as well as other current models of the area (Wren et al., 2016; Storlazzi et al., 2017).

Biological model

To simulate larval dispersal, we used a modified version of the Wong-Ala et al. (2018) IBM, a 3D Lagrangian particle-tracking model written in the R programming language (R Core Team, 2017). The model takes the aforementioned MITgcm current products as input, as well as shoreline shapefiles extracted from the full resolution NOAA Global Self-consistent Hierarchical High-resolution Geography database, v2.3.0 (Wessel & Smith, 1996). Our model included 65 land masses within the geographic domain, the largest being the island of Hawai‘i and the smallest being Pu‘uki‘i Island, a 1.5-acre islet off the eastern coast of Maui. To model depth, we used the one arc-minute-resolution ETOPO1 bathymetry, extracted using the R package ‘marmap’ (Amante & Eakins, 2009; Pante & Simon-Bouhet, 2013).

Each species was simulated with a separate model run. Larvae were modeled from spawning to settlement and were transported at each timestep (t = 2 h) by advection-diffusion transport. This transport consisted of (1) advective displacement caused by water flow, consisting of east (u) and north (v) velocities read from daily MITgcm files, and (2) additional random-walk displacement, using a diffusion constant of 0.2 m2/s (Lowe et al., 2009). Vertical velocities (w) were not implemented by the model; details of vertical larval movement are described below. Advection was interpolated between data points at each timestep using an Eulerian 2D barycentric interpolation method. We chose this implementation over a more computationally intensive interpolation method (i.e., fourth-order Runge–Kutta) because we did not observe a difference at this timestep length. Biological processes modeled include PLD, reproduction timing and location, mortality, and ontogenetic changes in vertical distribution; these qualities were parameterized via species-specific data obtained from previous studies and from the local fishing and management community (Table 1).
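
The advection-diffusion update described above can be sketched as follows. This is an illustrative Python sketch of the general scheme only, not the authors' R implementation; the function name and structure are ours.

```python
import math
import random

# One advection-diffusion step (t = 2 h), assuming the east (u) and north (v)
# velocities (m/s) have already been interpolated to the larva's position.
DT = 2 * 3600          # timestep length in seconds
D = 0.2                # horizontal diffusion constant, m^2/s (Lowe et al., 2009)

def advect_diffuse(x, y, u, v, rng=random):
    """Return the new (x, y) position in metres: advection plus random walk."""
    # (1) Advective displacement from the interpolated current field
    dx = u * DT
    dy = v * DT
    # (2) Random-walk displacement: each axis receives a Gaussian kick with
    # standard deviation sqrt(2 * D * dt)
    sigma = math.sqrt(2 * D * DT)
    dx += rng.gauss(0, sigma)
    dy += rng.gauss(0, sigma)
    return x + dx, y + dy
```

With the random-walk term suppressed, a larva in a 0.1 m/s eastward current moves 720 m east per 2-h step, which is the pure advective part of the update.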

Larvae were released from habitat-specific spawning sites and were considered settled if they fell within a roughly one-km contour around reef or intertidal habitat at the end of their pelagic larval duration. Distance from habitat was used rather than water depth because Penguin Bank, a relatively shallow bank to the southwest of Moloka‘i, does not represent suitable habitat for reef-associated species. PLD for each larva was a randomly assigned value between the minimum and maximum PLD for that species, and larvae were removed from the model if they had reached their PLD and were not within a settlement zone. No data on pre-competency period were available for our study species, so this parameter was not included. Mortality rates were calculated as larval half-lives; e.g., one-half of all larvae were assumed to have survived at one-half of the maximum PLD for that species (following Holstein, Paris & Mumby, 2014). Since our focus was on potential connectivity pathways, reproductive rates were calibrated to allow for saturation of possible settlement sites, equating to between ∼900,000 and ∼1,700,000 larvae released depending on species. Fecundity was therefore derived not from biological data, but from computational minimums.
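
The PLD assignment and half-life mortality rules can be sketched as below. This is a hypothetical Python rendering, not the authors' R code; the exponential form is one way to realize the stated rule that one-half of larvae survive to one-half of the maximum PLD.

```python
import math
import random

def assign_pld(min_pld, max_pld, rng=random):
    """Each larva receives a uniformly random PLD (days) between the
    species-specific minimum and maximum."""
    return rng.uniform(min_pld, max_pld)

def survival_probability(age_days, max_pld):
    """Exponential survival tuned so that exactly half of all larvae
    survive to age max_pld / 2 (the larval half-life rule)."""
    k = 2.0 * math.log(2.0) / max_pld  # instantaneous mortality rate per day
    return math.exp(-k * age_days)
```

For a species with a 90-day maximum PLD, `survival_probability(45, 90)` evaluates to 0.5, matching the half-life definition.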

Development, and resulting ontogenetic changes in behavior, is specific to the life history of each species. Broadcast-spawning species with weakly-swimming larvae (P. meandrina and Cellana spp., Table 1) were transported as passive particles randomly distributed between 0–5 m depth (Storlazzi, Brown & Field, 2006). Previous studies have demonstrated that fish larvae have a high degree of control over their vertical position in the water column (Irisson et al., 2010; Huebert, Cowen & Sponaugle, 2011). Therefore, we modeled broadcast-spawning fish species with a 24-hour passive buoyant phase to simulate eggs pre-hatch, followed by a pelagic larval phase with a species-specific depth distribution. For C. ignoblis, C. melampygus, P. porphyreus, C. perspicillatus, and S. rubroviolaceus, we used genus-level depth distributions (Fig. S3) obtained from the 1996 NOAA ichthyoplankton vertical distributions data report (Boehlert & Mundy, 1996). P. sexfilis and C. strigosus larvae were randomly distributed between 50–100 m (Boehlert, Watson & Sun, 1992). Benthic brooding species (O. cyanea and Panulirus spp.) do not have a passive buoyant phase, and thus were released as larvae randomly distributed between 50–100 m. At each time step, a larva’s depth was checked against bathymetry, and was assigned to the nearest available layer if the species-specific depth was not available at these coordinates.
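
The depth-assignment step, including the bathymetry check described at the end of the paragraph, can be sketched as follows. The layer midpoints here are hypothetical stand-ins for the real 50-layer MITgcm grid, and the function is our illustration rather than the authors' implementation.

```python
import random

# Hypothetical layer midpoints (m); the actual MITgcm grid has 50 layers
# whose thickness increases with depth.
LAYER_DEPTHS = [2.5, 7.5, 15, 30, 50, 75, 100, 150, 250, 500]

def assign_depth(depth_min, depth_max, seafloor_depth, rng=random):
    """Draw a depth from the species' preferred range, then snap it to the
    nearest model layer that is not deeper than the local seafloor."""
    target = rng.uniform(depth_min, depth_max)
    available = [d for d in LAYER_DEPTHS if d <= seafloor_depth]
    if not available:                     # extremely shallow cell
        available = [LAYER_DEPTHS[0]]
    return min(available, key=lambda d: abs(d - target))
```

A larva targeting 80 m over a 60-m-deep bank is reassigned to the 50-m layer, the nearest available depth at those coordinates.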

For data-poor species, we used congener-level estimates for PLD (see Table 1). For example, there is no estimate of larval duration for Caranx species, but in Hawai‘i peak spawning occurs in May–July and peak recruitment in August–December (Sudekum, 1984; Longenecker, Langston & Barrett, 2008). In consultation with resource managers and community members, a PLD of 140 days was chosen pending future data that indicates a more accurate pelagic period.

Habitat selection

Spawning sites were generated using data from published literature and modified after input from Native Hawaiian cultural practitioners and the Moloka‘i fishing community (Fig. 1). Species-specific habitat suitability was inferred from the 2013–2016 Marine Biogeographic Assessment of the Main Hawaiian Islands (Costa & Kendall, 2016). We designated coral habitat as areas with 5–90% coral cover, or ≥1 site-specific coral species richness, for a total of 127 spawning sites on Moloka‘i. Habitat for reef invertebrates followed coral habitat, with additional sites added after community feedback for a total of 136 sites. Areas with a predicted reef fish biomass of 58–1,288 g/m2 were designated as reef fish habitat (Stamoulis et al., 2016), for a total of 109 spawning sites. Sand habitat was designated as 90–100% uncolonized for a total of 115 sites. Intertidal habitat was designated as any rocky shoreline area not covered by sand or mud, for a total of 87 sites. Number of adults was assumed equal at all sites. For regional analysis, we pooled sites into groups of two to 11 sites based on benthic habitat and surrounding geography (Fig. 1A). Adjacent sites were grouped if they shared the same benthic habitat classification and prevailing wave direction, and/or were part of the same reef tract.

Figure 1: Spawning sites used in the model by species. (A) C. perspicillatus, S. rubroviolaceus, P. porphyreus, C. strigosus, C. ignoblis, and C. melampygus, n = 109; (B) P. meandrina, n = 129; (C) O. cyanea and Panulirus spp., n = 136; (D) P. sexfilis, n = 115; and (E) Cellana spp., n = 87. Region names are displayed over associated spawning sites for fish species in (A). Regions are made up of two to 11 sites, grouped based on coastal geography and surrounding benthic habitat, and are designated in (A) by adjacent colored dots. Kalaupapa National Historical Park is highlighted in light green in (A).

Source–sink dynamics and local retention

Dispersal distance was measured via the distm function in the R package ‘geosphere’, which calculates distance between geographical points via the Haversine formula (Hijmans, 2016). This distance, measured between spawn and settlement locations, was used to calculate dispersal kernels to examine and compare species-specific distributions. We also measured local retention, or the percentage of successful settlers from a site that were retained at that site (i.e., settlers at site A that originated from site A/total successful settlers that originated from site A). To estimate the role of specific sites around Moloka‘i, we also calculated a source–sink index for each species (Holstein, Paris & Mumby, 2014; Wren et al., 2016). This index defines sites as either a source, in which a site’s successful export to other sites is greater than its import, or a sink, in which import from other sites is greater than successful export. It is calculated by dividing the difference between number of successfully exported and imported larvae by the sum of all successfully exported and imported larvae. A value <0 indicates that a site acts as a net sink, while a value >0 indicates that a site acts as a net source. While we measured successful dispersal to adjacent islands, we did not spawn larvae from them, and therefore these islands represent exogenous sinks. For this reason, settlement to other islands was not included in source–sink index calculations.
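
The two quantities defined above can be sketched directly. The haversine distance mirrors what `geosphere::distm` computes (the function below is our Python stand-in, using the WGS84 equatorial radius), and the source–sink index follows the formula in the text: the difference between successful export and import divided by their sum.

```python
import math

def haversine_km(lon1, lat1, lon2, lat2, radius_km=6378.137):
    """Great-circle distance (km) between two lon/lat points via the
    haversine formula, as used for dispersal distances."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

def source_sink_index(exported, imported):
    """(successful export - import) / (export + import):
    > 0 means a net larval source, < 0 a net larval sink."""
    total = exported + imported
    return (exported - imported) / total if total else 0.0
```

A site that successfully exports 75 larvae while importing 25 has an index of +0.5 (a net source); reversing those counts gives −0.5 (a net sink).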

We also calculated settlement proportion between different regions for each species (Calabrese & Fagan, 2004). We calculated the forward settlement proportion, i.e., the proportion of settlers from a specific settlement site (s) originating from an observed origin site (o), by scaling the number of successful settlers from site o settling at site s to all successful settlers originating from site o. Forward proportion can be represented as P_so = S_os / ΣS_o. We also calculated rearward settlement proportion, or the proportion of settlers from a specific origin site (o) observed at settlement site (s), by scaling the number of settlers observed at site s originating from site o to all settlers observed at site s. The rearward proportion can be represented as P_os = S_os / ΣS_s.
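
The distinction between the two normalizations can be made concrete with a toy connectivity matrix (the site names and counts below are hypothetical, purely to illustrate the formulas):

```python
# settlers[o][s] = number of larvae spawned at origin o that settled at site s
settlers = {
    "A": {"A": 30, "B": 10},
    "B": {"A": 20, "B": 40},
}

def forward_proportion(o, s):
    """P_so = S_os / sum(S_o): scaled by all successful settlers FROM origin o."""
    total_from_o = sum(settlers[o].values())
    return settlers[o].get(s, 0) / total_from_o

def rearward_proportion(o, s):
    """P_os = S_os / sum(S_s): scaled by all settlers OBSERVED AT site s."""
    total_at_s = sum(row.get(s, 0) for row in settlers.values())
    return settlers[o].get(s, 0) / total_at_s
```

Here 30 of the 40 larvae spawned at A settle back at A, so the forward proportion A→A is 0.75; but A receives 50 settlers in total, so the rearward proportion is 0.6. The same count yields different proportions depending on whether it is scaled by origin or by destination.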

Graph-theoretic analysis

To quantify connections between sites, we applied graph theory to population connectivity (Treml et al., 2008; Holstein, Paris & Mumby, 2014). Graph theoretic analysis is highly scalable and can be used to examine fine-scale networks between reef sites up to broad-scale analyses between islands or archipelagos, mapping to both local and regional management needs. It also allows for both network- and site-specific metrics, enabling the comparison of connectivity between species and habitat sites as well as highlighting potential multi-generational dispersal corridors. Graph theory also provides a powerful tool for spatial visualization, allowing for rapid, intuitive communication of connectivity results to researchers, managers, and the public alike. This type of analysis can be used to model pairwise relationships between spatial data points by breaking down individual-based output into a series of nodes (habitat sites) and edges (directed connections between habitat sites). We then used these nodes and edges to examine the relative importance of each site and dispersal pathway to the greater pattern of connectivity around Moloka‘i, as well as differences in connectivity patterns between species (Treml et al., 2008; Holstein, Paris & Mumby, 2014). We used the R package ‘igraph’ to examine several measures of within-island connectivity (Csardi & Nepusz, 2006). Edge density, or the proportion of realized edges out of all possible edges, is a multi-site measure of connectivity. Areas with a higher edge density have more direct connections between habitat sites, and thus are more strongly connected. We measured edge density along and between the north, south, east, and west coasts of Moloka‘i to examine possible population structure and degree of exchange among the marine resources of local communities.
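
Edge density as used here can be sketched in a few lines. The authors computed it with the R package 'igraph'; the Python function below is our stand-in, following the definition in Fig. 6B of realized connections out of all possible pairs, disregarding directionality.

```python
def edge_density(nodes, edges):
    """Proportion of realized connections out of all possible node pairs,
    with directed edges (a, b) and (b, a) collapsed into one pair."""
    n = len(nodes)
    possible = n * (n - 1) / 2
    # frozenset makes (a, b) and (b, a) compare equal; self-loops excluded
    undirected = {frozenset(e) for e in edges if e[0] != e[1]}
    return len(undirected) / possible
```

For three habitat sites with larval exchange A↔B and B→C, two of the three possible pairs are realized, giving a density of 2/3.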

The distribution of shortest path length is also informative for comparing overall connectivity. In graph theory, a shortest path is the minimum number of steps needed to connect two sites. For example, two sites that exchange larvae in either direction are connected by a shortest path of one, whereas if they both share larvae with an intermediate site but not with each other, they are connected by a shortest path of two. In a biological context, shortest path can correspond to number of generations needed for exchange: sites with a shortest path of two require two generations to make a connection. Average shortest path, therefore, is a descriptive statistic to estimate connectivity of a network. If two sites are unconnected, it is possible to have infinite-length shortest paths; here, these infinite values were noted but not included in final analyses.
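
The shortest-path computation, including the infinite lengths noted above for unconnected site pairs, can be sketched with a breadth-first search over the directed larval-exchange graph (an illustrative Python version; the authors used 'igraph' in R):

```python
from collections import deque

def shortest_path_lengths(adj, source):
    """BFS over directed larval connections. Path length corresponds to the
    number of generations needed to connect two sites; unreachable sites are
    reported as float('inf') (noted but excluded from final analyses)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, []):
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return {n: dist.get(n, float('inf')) for n in adj}
```

In a chain A→B→C with an isolated site D, site C is two generations from A, while D is unreachable and would be excluded from the average.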

Networks can also be broken into connected components (Csardi & Nepusz, 2006). A weakly connected component (WCC) is a subgraph in which all nodes are reachable from one another when edge direction is ignored. A network split into multiple WCCs indicates separate populations that do not exchange any individuals, and a large number of WCCs indicates a low degree of island-wide connectivity. A strongly connected component (SCC) is a subgraph in which every node is reachable from every other node following edge direction, indicating a high degree of connectivity. A region with many small SCCs can indicate high local connectivity but low island-wide connectivity. Furthermore, component analysis can identify cut nodes, or nodes that, if removed, break a network into multiple WCCs. Pinpointing these cut nodes can identify sites that are potentially important for preserving a population's connectivity, and could inform predictions about the impact of site loss (e.g., a large-scale coral bleaching event) on overall connectivity.
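
Weak components and cut nodes can be sketched as below (our illustrative Python; the authors used 'igraph'). A cut node is detected by removing each node in turn and checking whether the remaining network splits into more weakly connected components than before.

```python
def weakly_connected_components(adj):
    """Components of a directed graph with edge direction ignored."""
    undirected = {n: set() for n in adj}
    for a, targets in adj.items():
        for b in targets:
            undirected.setdefault(a, set()).add(b)
            undirected.setdefault(b, set()).add(a)
    seen, components = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(undirected[n] - comp)
        seen |= comp
        components.append(comp)
    return components

def cut_nodes(adj):
    """Nodes whose removal splits the network into additional WCCs."""
    base = len(weakly_connected_components(adj))
    cuts = []
    for node in adj:
        reduced = {n: [t for t in targets if t != node]
                   for n, targets in adj.items() if n != node}
        if len(weakly_connected_components(reduced)) > base:
            cuts.append(node)
    return cuts
```

In the chain A↔B↔C, removing the middle site B isolates A from C, so B is the single cut node; removing an end site merely shortens the chain.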

On a regional scale, it is important to note which sites are exporting larvae to, or importing larvae from, other sites. To this end, we examined in-degree and out-degree for each region. In-degree refers to the number of inward-directed edges to a specific node, or how many other sites provide larvae into site ‘A’. Out-degree refers to the number of outward-directed edges from a specific node, or how many sites receive larvae from site ‘A’. Habitat sites with a high out-degree seed a large number of other sites, and indicate potentially important larval sources, while habitat sites with a low in-degree rely on a limited number of larval sources and may therefore be dependent on connections with these few other sites to maintain population size. Finally, betweenness centrality (BC) refers to the number of shortest paths that pass through a given node, and may therefore indicate connectivity pathways or ‘chokepoints’ that are important to overall connectivity on a multigenerational timescale. BC was weighted with the proportion of dispersal as described in the preceding section. We calculated in-degree, out-degree, and weighted betweenness centrality for each region in the network for each species.
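
In-degree, out-degree, and an unweighted form of betweenness centrality can be sketched as follows. This is a naive Python illustration suitable only for small networks (the authors used weighted betweenness via 'igraph'); the enumeration approach is ours.

```python
from collections import deque
from itertools import permutations

def degree_counts(edges):
    """In- and out-degree per node from directed (source, target) edges."""
    indeg, outdeg = {}, {}
    for a, b in edges:
        outdeg[a] = outdeg.get(a, 0) + 1
        indeg[b] = indeg.get(b, 0) + 1
    return indeg, outdeg

def betweenness(nodes, adj):
    """Unweighted betweenness: for each ordered pair (s, t), each interior
    node of a shortest s->t path gets credit 1 / (number of shortest paths)."""
    def shortest_paths(s, t):
        best, paths = None, []
        queue = deque([[s]])
        while queue:
            path = queue.popleft()
            if best is not None and len(path) > best:
                break                      # BFS: all remaining paths are longer
            node = path[-1]
            if node == t and len(path) > 1:
                best = len(path)
                paths.append(path)
                continue
            for nxt in adj.get(node, []):
                if nxt not in path:
                    queue.append(path + [nxt])
        return paths
    score = {n: 0.0 for n in nodes}
    for s, t in permutations(nodes, 2):
        paths = shortest_paths(s, t)
        for p in paths:
            for v in p[1:-1]:
                score[v] += 1.0 / len(paths)
    return score
```

In the chain A→B→C, only the path A→C passes through an intermediate site, so B carries all the betweenness, matching the intuition of a connectivity 'chokepoint'.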

As with the source–sink index, we did not include sites on islands other than Moloka‘i in our calculations of edge density, shortest paths, connected components, cut nodes, in- and out-degree, or betweenness centrality in order to focus on within-island patterns of connectivity.

Results Effects of biological parameters on fine-scale connectivity patterns

The species-specific parameters that were available to parameterize the dispersal models substantially influenced final output (Fig. 2). The proportion of successful settlers (either to Moloka‘i or to neighboring islands) varied widely by species, from 2% (Panulirus spp.) to 25% (Cellana spp.). Minimum pelagic larval duration and settlement success were negatively correlated (Pearson correlation coefficient of approximately −0.79). Species modeled with batch spawning at a specific moon phase and/or time of day (Cellana spp., P. meandrina, and C. ignoblis) displayed slightly higher settlement success than similar species modeled with constant spawning over specific months. On a smaller scale, we also examined average site-scale local retention, comparing only retention to the spawning site versus other sites on Moloka‘i (Fig. 2). Local retention was lowest for Caranx spp. (<1%) and highest for O. cyanea and P. sexfilis (8.1% and 10%, respectively).

Figure 2: Summary statistics for each species network. Summary statistics are displayed in order of increasing minimum pelagic larval duration from left to right. Heatmap colors are based on normalized values from 0–1 for each analysis. Successful settlement refers to the proportion of larvae settled out of the total number of larvae spawned. Local retention is measured as the proportion of larvae spawned from a site that settle at the same site. Shortest path is measured as the minimum number of steps needed to connect two sites. Strongly connected sites refers to the proportion of sites in a network that belong to a strongly connected component. Mean dispersal distance is measured in kilometers from spawn site to settlement site.

We measured network-wide connectivity via distribution of shortest paths, or the minimum number of steps between any two nodes in a network, only including sites on Moloka‘i (Fig. 2). O. cyanea and P. sexfilis showed the smallest shortest paths overall, meaning that on average, it would take fewer generations for these species to demographically bridge any given pair of sites. Using maximum shortest path, it could take these species three generations at most to connect sites. Cellana spp. and P. meandrina, by comparison, could take as many as five generations. Other medium- and long-dispersing species showed relatively equivalent shortest-path distributions, with trevally species showing the highest mean path length and therefore the lowest island-scale connectivity.

The number and size of weakly-connected and strongly-connected components in a network is also an informative measure of connectivity (Fig. 2). No species in our study group was broken into multiple weakly-connected components; however, there were species-specific patterns of strongly connected sites. O. cyanea and P. sexfilis were the most strongly connected, with all sites in the network falling into a single SCC. Cellana spp. and P. meandrina each had approximately 60% of sites included in a SCC, but both show fragmentation with seven and six SCCs respectively, ranging in size from two to 22 sites. This SCC pattern suggests low global connectivity but high local connectivity for these species. Medium and long dispersers showed larger connected components; 70% of parrotfish sites fell within two SCCs; 40% of P. porphyreus sites fell within two SCCs; 70% of C. strigosus sites, 55% of C. melampygus sites, and 40% of Panulirus sites fell within a single SCC. In contrast, only 26% of C. ignoblis sites fell within a single SCC. It is also important to note that the lower connectivity scores observed in long-dispersing species likely reflect a larger scale of connectivity. Species with a shorter PLD are highly connected at reef and island levels but may show weaker connections between islands. Species with a longer PLD, such as trevally or spiny lobster, are likely more highly connected at inter-island scales, which is consistent with the lower per-island connectivity scores shown here.

Figure 3: Dispersal distance density kernels. Dispersal distance is combined across species by minimum pelagic larval duration (PLD) length in days (short, medium, or long). Most short dispersers settle close to home, while few long dispersers are retained at or near their spawning sites.

Minimum PLD was positively correlated with mean dispersal distance (Pearson correlation coefficient of approximately 0.88, with minimum PLD natural-log-transformed to linearize the relationship), and dispersal kernels differed between species that are short dispersers (3–25 days), medium dispersers (30–50 days), or long dispersers (140–270 days) (Fig. 3). Short dispersers travelled a mean distance of 24.06 ± 31.33 km, medium dispersers travelled a mean distance of 52.71 ± 40.37 km, and long dispersers travelled the farthest, at a mean of 89.41 ± 41.43 km. However, regardless of PLD, there were essentially two peaks of mean dispersal: a short-distance peak of <30 km, and a long-distance peak of roughly 50–125 km (Fig. 3). The short-distance peak largely represents larvae that settle back to Moloka‘i, while the long-distance peak largely represents settlement to other islands; the low point between them corresponds to deep-water channels between islands, i.e., unsuitable habitat for settlement. Median dispersal distance for short dispersers was substantially less than the mean at 8.85 km, indicating that most of these larvae settled relatively close to their spawning sites, with rare long-distance dispersal events bringing up the average. Median distance for medium (54.22 km) and long (91.57 km) dispersers was closer to the mean, indicating more even distance distributions and thus a higher probability of long-distance dispersal for these species. Maximum dispersal distance varied between ∼150 and ∼180 km depending on species, except for the spiny lobster Panulirus spp., with a PLD of 270 d and a maximum dispersal distance of approximately 300 km.
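
The median-below-mean signature described above is easy to demonstrate with a toy sample (the distances below are hypothetical, chosen only to mimic a right-skewed kernel with rare long-distance events):

```python
import statistics

# Hypothetical dispersal distances (km) for a short-dispersing species:
# most larvae settle near home, with a few rare long-distance events.
distances = [2, 3, 4, 5, 5, 6, 7, 8, 9, 10, 60, 150]

mean_km = statistics.mean(distances)      # inflated by the two outliers
median_km = statistics.median(distances)  # robust to them
# A median well below the mean is the signature of rare long-distance
# dispersal bringing up the average, as reported for the short dispersers.
skew_right = median_km < mean_km
```

Here the median is 6.5 km while the mean exceeds 22 km, so a handful of long-distance settlers triples the average even though most larvae settle close to their spawning site.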

Settlement to Moloka‘i and other islands in the archipelago

Different species showed different forward settlement proportion to adjacent islands (Fig. 4), although every species in the study group successfully settled back to Moloka‘i. P. meandrina showed the highest percentage of island-scale local retention (82%), while C. ignoblis showed the lowest (7%). An average of 74% of larvae from short-dispersing species settled back to Moloka‘i, as compared to an average of 41% of medium dispersers and 9% of long dispersers. A large proportion of larvae also settled to O‘ahu, with longer PLDs resulting in greater proportions, ranging from 14% of O. cyanea to 88% of C. ignoblis. Moloka‘i and O‘ahu were the most commonly settled islands by percentage. Overall, settlement from Moloka‘i to Lana‘i, Maui, Kaho‘olawe, and Hawai‘i was somewhat lower. Larvae of every species settled to Lana‘i, and settlement to this island made up less than 5% of settled larvae across all species. Likewise, settlement to Maui made up less than 7% of settlement across species, with P. meandrina as the only species that had no successful paths from Moloka‘i to Maui. Settlement to Kaho‘olawe and Hawai‘i was less common, with the exception of Panulirus spp., which had 16% of all settled larvae on Hawai‘i.

Figure 4: Forward settlement from Moloka’i to other islands. Proportion of simulated larvae settled to each island from Moloka‘i by species, organized in order of increasing minimum pelagic larval duration from left to right.

We also examined coast-specific patterns of rearward settlement proportion to other islands, discarding connections with a very low proportion of larvae (<0.1% of total larvae of that species settling to other islands). Averaged across species, 83% of larvae settling to O‘ahu from Moloka‘i were spawned on the north shore of Moloka‘i, with 12% spawned on the west shore (Fig. S4). Spawning sites on the east and south shores contributed <5% of all larvae settling to O‘ahu from Moloka‘i. The east and south shores of Moloka‘i had the highest average percentage of larvae settling to Lana‘i from Moloka‘i, at 78% and 20% respectively, and to Kaho‘olawe from Moloka‘i at 63% and 34%. Of the species that settled to Maui from Moloka‘i, on average most were spawned on the east (53%) or north (39%) shores, as were the species that settled to Hawai‘i Island from Moloka‘i (22% east, 76% north). These patterns indicate that multiple coasts of Moloka‘i have the potential to export larvae to neighboring islands.

Temporal settlement profiles also varied by species (Fig. 5). Species modeled with moon-phase spawning and relatively short settlement windows (Cellana spp. and C. ignoblis) were characterized by discrete settlement pulses, whereas other species showed settlement over a broader period of time. Some species also showed distinctive patterns of settlement to other islands; our model suggests specific windows when long-distance dispersal is possible, as well as times of year when local retention is maximized (Fig. 5).

Figure 5: Species-specific temporal recruitment patterns. Proportion densities of settlement to specific islands from Moloka‘i based on day of year settled, by species. Rare dispersal events (e.g., Maui or Lana‘i for Cellana spp.) appear as narrow spikes, while broad distributions generally indicate more common settlement pathways.

Regional patterns of connectivity in Moloka‘i coastal waters

Within Moloka‘i, our model predicts that coast-specific population structure is likely; averaged across all species, 84% of individuals settled back to the same coast on which they were spawned rather than a different coast on Moloka‘i. Excluding connections with a very low proportion of larvae (<0.1% of total larvae of that species that settled to Moloka‘i), we found that the proportion of coast-scale local retention was generally higher than dispersal to another coast, with the exception of the west coast (Fig. 6A). The north and south coasts had a high degree of local retention in every species except for the long-dispersing Panulirus spp., and the east coast also had high local retention overall. Between coasts, a high proportion of larvae that spawned on the west coast settled on the north coast, and a lesser amount of larvae were exchanged from the east to south and from the north to east. With a few species-specific exceptions, larval exchange between other coasts of Moloka‘i was negligible.

Figure 6: Coast-by-coast patterns of connectivity on Moloka‘i. (A) Average rearward settlement proportion by species per pair of coastlines, calculated by the number of larvae settling at site s from site o divided by all settled larvae at site s. Directional coastline pairs (Spawn > Settlement) are ordered from left to right by increasing median settlement proportion. (B) Heatmap of edge density for coast-specific networks by species. Density is calculated by the number of all realized paths out of total possible paths, disregarding directionality.

We also calculated edge density, including all connections between coasts on Moloka‘i regardless of settlement proportion (Fig. 6B). The eastern coast was particularly well-connected, with an edge density between 0.14 and 0.44, depending on the species. The southern shore showed high edge density for short and medium dispersers (0.16–0.39) but low for long dispersers (<0.005). The north shore also showed relatively high edge density (0.20 on average), although these values were smaller for long dispersers. The west coast showed very low edge density, with the exceptions of O. cyanea (0.37) and P. sexfilis (0.13). Virtually all networks that included two coasts showed lower edge density. One exception was the east/south shore network, which had an edge density of 0.10–0.65 in all species except Cellana spp. Across species, edge density between the south and west coasts was 0.12 on average, and between the east and west coasts was 0.04 on average. Edge density between north and south coasts was particularly low for all species (<0.05), a divide that was especially distinct in Cellana spp. and P. meandrina, which showed zero realized connections between these coasts. Although northern and southern populations are potentially weakly connected by sites along the eastern (P. meandrina) or western (Cellana spp.) shores, our model predicts very little, if any, demographic connectivity.

To explore patterns of connectivity on a finer scale, we pooled sites into regions (as defined in Fig. 1) in order to analyze relationships between these regions. Arranging model output into node-edge networks clarified pathways and regions of note, and revealed several patterns which did not follow simple predictions based on PLD (Fig. 7). Cellana spp. and P. meandrina showed the most fragmentation, with several SCCs and low connectivity between coasts. Connectivity was highest in O. cyanea and P. sexfilis, which had a single SCC containing all regions. Medium and long dispersers generally showed fewer strongly connected regions on the south shore than the north shore, with the exception of C. strigosus. P. porphyreus showed more strongly connected regions east of Kalaupapa but lower connectivity on the western half of the island.

Figure 7: Moloka’i connectivity networks by species. Graph-theoretic networks between regions around Moloka’i by species arranged in order of minimum pelagic larval duration. (A–D) Short dispersers (3–25 days), (E–G) medium dispersers (30–50 days), and (H–J) long dispersers (140–270 days). Node size reflects betweenness centrality of each region, scaled per species for visibility. Node color reflects out-degree of each region; yellow nodes have a low out-degree, red nodes have a medium out-degree, and black nodes have a high out-degree. Red edges are connections in a strongly connected component, while gray edges are not part of a strongly connected component (although may still represent substantial connections). Edge thickness represents log-transformed proportion of dispersal along that edge.

Region-level networks showed both species-specific and species-wide patterns of connectivity (Fig. 8). With a few exceptions, sites along the eastern coast (notably, Cape Halawa and Pauwalu Harbor) showed relatively high betweenness centrality, and may therefore act as multigenerational pathways between north-shore and south-shore populations. In Cellana spp., Leinaopapio Point and Mokio Point had the highest BC, while in high-connectivity O. cyanea and P. sexfilis, regions on the west coast had high BC scores. P. meandrina and C. strigosus showed several regions along the south shore with high BC. For Cellana spp. and P. meandrina, regions in the northeast had the highest out-degree, and therefore seeded the greatest number of other sites with larvae (Fig. 8). Correspondingly, regions in the northwest (and southwest in the case of P. meandrina) showed the highest in-degree. For O. cyanea and P. sexfilis, regions on the western and southern coasts showed the highest out-degree. For most species, both out-degree and in-degree were generally highest on the northern and eastern coasts, suggesting higher connectivity in these areas.

Figure 8: Region-level summary statistics across all species. Betweenness centrality is a measure of the number of paths that pass through a certain region; a high score suggests potentially important multi-generation connectivity pathways. In-degree and out-degree refer to the amount of a node’s incoming and outgoing connections. Betweenness centrality, in-degree, and out-degree have all been normalized to values between 0 to 1 per species. Local retention is measured as the proportion of larvae that settled back to their spawn site out of all larvae spawned at that site. Source-sink index is a measure of net export or import; negative values (blue) indicate a net larval sink, while positive values (red) indicate a net larval source. White indicates that a site is neither a strong source nor sink. Gray values for Cellana spp. denote a lack of suitable habitat sites in that particular region.

Several species-wide hotspots of local retention emerged, particularly East Kalaupapa Peninsula/Leinaopapio Point, the northeast point of Moloka‘i, and the middle of the south shore. Some species also showed some degree of local retention west of Kalaupapa Peninsula. While local retention was observed in the long-dispersing Caranx spp. and Panulirus spp., this amount was essentially negligible. In terms of source–sink dynamics, Ki‘oko‘o, Pu‘ukaoku Point, and West Kalaupapa Peninsula, all on the north shore, were the only sites that consistently acted as a net source, exporting more larvae than they import (Fig. 8). Kaunakakai Harbor, Lono Harbor, and Mokio Point acted as net sinks across all species. Puko‘o, Pauwalu Harbor, and Cape Halawa were either weak net sources or neither sources nor sinks, which corresponds to the high levels of local retention observed at these sites. Pala‘au and Mo‘omomi acted as either weak sinks or sources for short dispersers and as sources for long dispersers.

Only four networks showed regional cut-nodes, or nodes that, if removed, break a network into multiple weakly-connected components (Fig. S5). Cellana spp. showed two cut-nodes: Mokio Point in northwest Moloka‘i and La‘au Point in southwest Moloka‘i, which if removed isolated Small Bay and Lono Harbor, respectively. C. perspicillatus and S. rubroviolaceus showed a similar pattern with regard to Mokio Point; removal of this node isolated Small Bay in these species as well. In C. ignoblis, loss of Pauwalu Harbor isolated Lono Harbor, and loss of Pala‘au isolated Ilio Point on the northern coast. Finally, in Panulirus spp., loss of Leinaopapio Point isolated Papuhaku Beach, since Leinaopapio Point was the only larval source from Moloka‘i for Papuhaku Beach in this species.
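The cut-node definition used here (a node whose removal splits the network into additional weakly-connected components) can be checked by brute force on small regional networks. The following is an illustrative sketch, not the analysis code used in the study.

```java
import java.util.*;

public class CutNodes {
    // Count weakly connected components of a directed graph given as an
    // adjacency matrix (edge i->j iff adj[i][j]), ignoring nodes in 'removed'.
    static int weakComponents(boolean[][] adj, Set<Integer> removed) {
        int n = adj.length;
        boolean[] seen = new boolean[n];
        int comps = 0;
        for (int start = 0; start < n; start++) {
            if (seen[start] || removed.contains(start)) continue;
            comps++;
            Deque<Integer> stack = new ArrayDeque<>();
            stack.push(start);
            seen[start] = true;
            while (!stack.isEmpty()) {
                int u = stack.pop();
                for (int v = 0; v < n; v++) {
                    if (seen[v] || removed.contains(v)) continue;
                    // Edge direction is ignored for *weak* connectivity.
                    if (adj[u][v] || adj[v][u]) { seen[v] = true; stack.push(v); }
                }
            }
        }
        return comps;
    }

    // A cut-node is any node whose removal increases the component count.
    static List<Integer> cutNodes(boolean[][] adj) {
        int base = weakComponents(adj, Collections.emptySet());
        List<Integer> cuts = new ArrayList<>();
        for (int i = 0; i < adj.length; i++)
            if (weakComponents(adj, Collections.singleton(i)) > base)
                cuts.add(i);
        return cuts;
    }

    public static void main(String[] args) {
        // 0 -> 1 -> 2 : node 1 is the only path between nodes 0 and 2.
        boolean[][] adj = new boolean[3][3];
        adj[0][1] = true;
        adj[1][2] = true;
        System.out.println(cutNodes(adj));
    }
}
```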

Figure 9: Connectivity matrix for larvae spawned on Kalaupapa Peninsula. Includes larvae settled on Moloka‘i (regions below horizontal black line) and those settled on other islands (regions above horizontal black line), spawned from either the east (E) or west (W) coast of Kalaupapa. Heatmap colors represent rearward proportion, calculated as the number of larvae settling at site s from site o divided by all settled larvae at site s. White squares indicate no dispersal along this path.

The role of Kalaupapa Peninsula in inter- and intra-island connectivity

Our model suggests that Kalaupapa National Historical Park may play a role in inter-island connectivity, especially in terms of long-distance dispersal. Out of all regions on Moloka‘i, East Kalaupapa Peninsula was the single largest exporter of larvae to Hawai‘i Island, accounting for 19% of all larvae transported from Moloka‘i to this island; West Kalaupapa Peninsula accounted for another 10%. The park also contributed 22% of all larvae exported from Moloka‘i to O‘ahu, and successfully exported a smaller percentage of larvae to Maui, Lana‘i, and Kaho‘olawe (Fig. 9). Kalaupapa was not marked as a cut-node for any species, meaning that full population breaks are not predicted in the case of habitat or population loss in this area. Nevertheless, in our model Kalaupapa exported larvae to multiple regions along the north shore in all species, as well as regions along the east, south, and/or west shores in most species networks (Figs. 9 and 10). The park may play a particularly important role for long-dispersing species; settlement from Kalaupapa made up 18%–29% of all successful settlement in Caranx spp. and Panulirus spp., despite making up only 12% of spawning sites included in the model. In C. perspicillatus, S. rubroviolaceus, and C. strigosus, Kalaupapa showed a particularly high out-degree, or number of outgoing connections to other regions, and West Kalaupapa was also one of the few regions on Moloka‘i that acted as a net larval source across all species (Fig. 8). Our study has also demonstrated that different regions of a marine protected area can perform different roles, even in an MPA as small as Kalaupapa. Across species, the east coast of Kalaupapa showed a significantly higher betweenness centrality than the west (p = 0.028), while the west coast of Kalaupapa showed a significantly higher source–sink index than the east (p = 2.63e−9).
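The rearward proportions mapped in Fig. 9 amount to a column normalization of a settlement-count matrix (larvae settling at site s from origin o, divided by all larvae that settled at s). A minimal sketch with hypothetical counts:

```java
public class RearwardProportion {
    // rearward[o][s] = larvae settling at s from o, divided by all larvae
    // that settled at s (i.e., each column is normalized by its sum).
    static double[][] rearward(double[][] counts) {
        int n = counts.length, m = counts[0].length;
        double[][] out = new double[n][m];
        for (int s = 0; s < m; s++) {
            double total = 0;
            for (int o = 0; o < n; o++) total += counts[o][s];
            for (int o = 0; o < n; o++)
                out[o][s] = total == 0 ? 0 : counts[o][s] / total;
        }
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical counts: 4 larvae settled at site 0 (3 local, 1 imported).
        double[][] counts = {{3, 1}, {1, 0}};
        double[][] r = rearward(counts);
        System.out.println("proportion of site 0 settlers of local origin: " + r[0][0]);
    }
}
```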

Figure 10: Larval spillover from Kalaupapa National Historical Park. Site-level dispersal to sites around Moloka‘i from sites in the Kalaupapa National Historical Park protected area, by species. (A–D) Short dispersers (3–25 days), (E–G) medium dispersers (30–50 days), and (H–J) long dispersers (140–270 days). Edge color reflects the proportion of dispersal along that edge; red indicates a higher proportion and yellow a lower proportion. Kalaupapa National Historical Park is highlighted in light green.

Discussion

Effects of biological and physical parameters on connectivity

We incorporated the distribution of suitable habitat, variable reproduction, variable PLD, and ontogenetic changes in swimming ability and empirical vertical distributions of larvae into our model to increase biological realism, and assess how such traits impact predictions of larval dispersal. The Wong-Ala et al. (2018) IBM provides a highly flexible model framework that can easily be modified to incorporate either additional species-specific data or entirely new biological traits. In this study, we included specific spawning seasons for all species, as well as spawning by moon phase for Cellana spp., P. meandrina, and C. ignoblis because such data was available for these species. It proved difficult to obtain the necessary biological information to parameterize the model, but as more data about life history and larval behavior become available, such information can be easily added for these species and others. Some potential additions to future iterations of the model might include density of reproductive-age adults within each habitat patch, temperature-dependent pelagic larval duration (Houde, 1989), ontogenetic-dependent behavioral changes such as orientation and diel vertical migration (Fiksen et al., 2007; Paris, Chérubin & Cowen, 2007), pre-competency period, and larval habitat preferences as such information becomes available.

In this study, we have demonstrated that patterns of fine-scale connectivity around Moloka‘i are largely species-specific and can vary with life history traits, even in species with identical pelagic larval durations. For example, the parrotfishes S. rubroviolaceus and C. perspicillatus show greater connectivity along the northern coast, while the goatfish P. porphyreus shows higher connectivity along the eastern half of the island. These species have similar PLD windows, but vary in dispersal depth and spawning season. Spawning season and timing altered patterns of inter-island dispersal (Fig. 5) as well as overall settlement success, which was slightly higher in species that spawned by moon phase (Fig. 2). While maximum PLD did appear to play a role in the probability of rare long-distance dispersal, minimum PLD appears to be the main driver of average dispersal distance (Fig. 2). Overall, species with a shorter minimum PLD had higher settlement success, shorter mean dispersal distance, higher local retention, and higher local connectivity as measured by the number and size of strongly connected components.

The interaction of biological and oceanographic factors also influenced connectivity patterns. Because mesoscale current patterns can vary substantially over the course of the year, the timing of spawning for certain species may be critical for estimating settlement (Wren et al., 2016; Wong-Ala et al., 2018). Intermittent ocean processes may influence the probability of local retention versus long-distance dispersal; a large proportion of larvae settled to O‘ahu, which is somewhat surprising given that in order to settle from Moloka‘i to O‘ahu, larvae must cross the Kaiwi Channel (approx. 40 km). However, the intermittent presence of mesoscale gyres may act as a stabilizing pathway across the channel, sweeping larvae up either the windward or leeward coast of O‘ahu depending on spawning site. Likewise, in our model long-distance dispersal to Hawai‘i Island was possible at certain times of the year due to a gyre to the north of Maui; larvae were transported from Kalaupapa to this gyre, where they were carried to the northeast shore of Hawai‘i (Fig. S6). Preliminary analysis also suggests that distribution of larval depth influenced edge directionality and size of connected components (Fig. 7); surface currents are variable and primarily wind-driven, giving positively-buoyant larvae different patterns of dispersal than species that disperse deeper in the water column (Fig. S7).

Model limitations and future perspectives

Our findings have several caveats. Because fine-scale density estimates are not available for our species of interest around Moloka‘i, we assumed that fecundity is equivalent at all sites. This simplification may lead us to under- or over-estimate the strength of connections between sites. Lack of adequate data also necessitated estimation or extrapolation from congener information for larval traits such as larval dispersal depth and PLD. Since it is difficult if not impossible to identify larvae to the species level without genetic analysis, we used genus-level larval distribution data (Boehlert & Mundy, 1996), or lacking that, an estimate of 50–100 m as a depth layer that is generally more enriched with larvae (Boehlert, Watson & Sun, 1992; Wren & Kobayashi, 2016). We also estimated PLD in several cases using congener-level data (see Table 1). While specificity is ideal for making informed management decisions about a certain species, past sensitivity analysis has shown that variation in PLD length does not greatly impact patterns of dispersal in species with a PLD of >40 days (Wren & Kobayashi, 2016).

Although our MITgcm current model shows annual consistency, it spans only two and a half years, chosen to represent neutral-state ‘average’ ocean conditions. It does not span any El Niño or La Niña (ENSO) events, which cause wide-scale sea-surface temperature anomalies and may therefore affect patterns of connectivity during those years. El Niño can have a particularly strong impact on coral reproduction, since the warm currents associated with these events can lead to severe temperature stress (Glynn & D’Croz, 1990; Wood et al., 2016). While there has been little study to date on the effects of ENSO on fine-scale connectivity, previous work has demonstrated increased variability during these events. For example, Wood et al. (2016) showed a decrease in eastward Pacific dispersal during El Niño years, but an increase in westward dispersal, and Treml et al. (2008) showed unique connections in the West Pacific as well as an increase in connectivity during El Niño. While these effects are difficult to predict, especially at such a small scale, additional model years would increase confidence in long-term connectivity estimations. Additionally, with a temporal resolution of 24 h, we could not adequately address the role of tides on dispersal, and therefore did not include them in the MITgcm. Storlazzi et al. (2017) showed that tidal forces did affect larval dispersal in Maui Nui, underlining the importance of including both fine-scale, short-duration models and coarser-scale, long-duration models in final management decisions.

We also limited our model’s scope geographically. Our goal was to determine whether we could resolve predictive patterns at a scale relevant to management. Interpretation of connectivity output can be biased by the spatial resolution of the ocean model, since complex coastal processes can be smoothed, impacting larval trajectories. To limit this bias, we focused mainly on coastal and regional connectivity on scales greater than the current resolution. We also used the finest-scale current products available for our study area, and our results show general agreement with similar studies of the region that use a coarser resolution (Wren & Kobayashi, 2016) and a finer resolution (Storlazzi et al., 2017). Also, while knowledge of island-scale connectivity is important for local management, it disregards potential connections from other islands. In our calculations of edge density, betweenness centrality, and source–sink index, we included only settlement to Moloka‘i, discarding exogenous sinks that would bias our analysis. Likewise, we cannot predict the proportion of larvae settling to other islands that originated from Moloka‘i, or the proportion of larvae on Moloka‘i that originated from other islands.

It is also important to note scale in relation to measures of connectivity; we expect that long-dispersing species such as Caranx spp. and Panulirus spp. will show much higher measures of connectivity when measured across the whole archipelago as opposed to a single island. The cut-nodes observed in these species may not actually break up populations on a large scale due to this inter-island connectivity. Nevertheless, cut-nodes in species with short- and medium-length PLD may indeed mark important habitat locations, especially in terms of providing links between two otherwise disconnected coasts. It may be that for certain species or certain regions, stock replenishment relies on larval import from other islands, underscoring the importance of MPA selection for population maintenance in the archipelago as a whole.

Implications for management

Clearly, there is no single management approach that encompasses the breadth of life history and behavior differences that impact patterns of larval dispersal and connectivity (Toonen et al., 2011; Holstein, Paris & Mumby, 2014). The spatial, temporal, and species-specific variability suggested by our model stresses the need for multi-scale management, specifically tailored to local and regional connectivity patterns and the suite of target species. Even on such a small scale, different regions around the island of Moloka‘i can play very different roles in the greater pattern of connectivity (Fig. 8); sites along the west coast, for example, showed fewer ingoing and outgoing connections than sites on the north coast, and therefore may be more at risk of isolation. Seasonal variation should also be taken into account, as mesoscale current patterns (and resulting connectivity patterns) vary over the course of a year. Our model suggests species-specific temporal patterns of settlement (Fig. 5); even in the year-round spawner O. cyanea, local retention to Moloka‘i as well as settlement to O‘ahu was maximized in spring and early summer, while settlement to other islands mostly occurred in late summer and fall.

Regions that show similar network dynamics may benefit from similar management strategies. Areas that act as larval sources either by proportion of larvae (high source–sink index) or number of sites (high out-degree) should receive management consideration. On Moloka‘i, across all species in our study, these sources fell mostly on the northern and eastern coasts. Maintenance of these areas is especially important for downstream areas that depend on upstream populations for a source of larvae, such as those with a low source–sink index, low in-degree, and/or low local retention. Across species, regions with the highest betweenness centrality scores fell mainly in the northeast (Cape Halawa and Pauwalu Harbor). These areas should receive consideration as potentially important intergenerational pathways, particularly as a means of connecting north-coast and south-coast populations, which showed a lack of connectivity both in total number of connections (edge density) and proportion of larvae. Both of these connectivity measures were included because edge density includes all connections, even those with a very small proportion of larvae, and may therefore include rare dispersal events that are of little relevance to managers. Additionally, edge density comparisons between networks should be viewed with the caveat that these networks do not necessarily have the same number of nodes. Nevertheless, both edge density and proportion show very similar patterns, and include both demographically-relevant common connections as well as rare connections that could influence genetic connectivity.

Management that seeks to establish a resilient network of spatially managed areas should also consider the preservation of both weakly-connected and strongly-connected components, as removal of key cut-nodes (Fig. S5) breaks up a network. Sites within a SCC have more direct connections and therefore may be more resilient to local population loss. Care should be taken to preserve breeding populations at larval sources, connectivity pathways, and cut-nodes within a SCC, since without these key sites the network can fragment into multiple independent SCCs instead of a single stable network. This practice may be especially important for species for which we estimate multiple small SCCs, such as Cellana spp. or P. meandrina.
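Strongly connected components like those discussed above can be extracted with a standard two-pass (Kosaraju) depth-first search over the directed larval-transport network. The sketch below is illustrative, not the study's implementation.

```java
import java.util.*;

public class SccFinder {
    // Kosaraju's algorithm: DFS to get a finish-time order, then DFS over the
    // reversed graph in that order; each second-pass tree is one SCC.
    static List<List<Integer>> scc(List<List<Integer>> adj) {
        int n = adj.size();
        boolean[] seen = new boolean[n];
        Deque<Integer> order = new ArrayDeque<>();
        for (int v = 0; v < n; v++) if (!seen[v]) dfs(v, adj, seen, order, null);

        // Build the reversed graph.
        List<List<Integer>> rev = new ArrayList<>();
        for (int v = 0; v < n; v++) rev.add(new ArrayList<>());
        for (int u = 0; u < n; u++) for (int v : adj.get(u)) rev.get(v).add(u);

        Arrays.fill(seen, false);
        List<List<Integer>> comps = new ArrayList<>();
        while (!order.isEmpty()) {
            int v = order.pop();
            if (seen[v]) continue;
            List<Integer> comp = new ArrayList<>();
            dfs(v, rev, seen, null, comp);
            comps.add(comp);
        }
        return comps;
    }

    // Shared DFS: records finish order when 'order' is given,
    // collects visited nodes when 'comp' is given.
    static void dfs(int u, List<List<Integer>> adj, boolean[] seen,
                    Deque<Integer> order, List<Integer> comp) {
        seen[u] = true;
        if (comp != null) comp.add(u);
        for (int v : adj.get(u)) if (!seen[v]) dfs(v, adj, seen, order, comp);
        if (order != null) order.push(u);
    }

    public static void main(String[] args) {
        // Sites 0 and 1 exchange larvae in both directions; site 2 only receives.
        List<List<Integer>> adj = List.of(List.of(1), List.of(0, 2), List.of());
        System.out.println(scc(adj).size() + " strongly connected components");
    }
}
```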

Kalaupapa Peninsula emerged as an important site in Moloka‘i population connectivity, acting as a larval source for other regions around the island. The Park seeded areas along the north shore in all species, and also exported larvae to sites along the east and west shores in all species except P. meandrina and Cellana spp. Additionally, it was a larval source for sites along the south shore in the fishes C. perspicillatus, S. rubroviolaceus, and C. strigosus as well as Panulirus spp. Western Kalaupapa Peninsula was one of only three regions included in the analysis (the others being Ki‘oko‘o and Pu‘ukaoku Point, also on the north shore) that acted as a net larval source across all species. Eastern Kalaupapa Peninsula was particularly highly connected, and was part of a strongly connected component in every species. The Park also emerged as a potential point of connection to adjacent islands, particularly to O‘ahu and Hawai‘i. Expanding the spatial scale of our model will further elucidate Kalaupapa’s role in the greater pattern of inter-island connectivity.

In addition to biophysical modeling, genetic analyses can be used to identify persistent population structure of relevance to managers (Cowen et al., 2000; Casey, Jardim & Martinsohn, 2016). Our finding that exchange among islands is generally low in species with a short- to medium-length PLD agrees with population genetic analyses of marine species in the Hawaiian Islands (Bird et al., 2007; Rivera et al., 2011; Toonen et al., 2011; Concepcion, Baums & Toonen, 2014). On a finer scale, we predict some level of shoreline-specific population structure for most species included in the study (Fig. 6). Unfortunately, genetic analyses to date have been performed over too broad a scale to compare effectively against these fine-scale connectivity predictions around Moloka‘i, or even among locations on adjacent islands. Our model results justify such small-scale genetic analyses: for species such as the coral P. meandrina, the model predicts a clear separation of north-shore and south-shore populations that should be simple to test with genetic data. Validating these model predictions will therefore require more fine-scale population genetic analyses.

Conclusions

The maintenance of demographically connected populations is important for conservation. In this study, we contribute to the growing body of work in biophysical connectivity modeling, focusing on a region and suite of species that are of relevance to resource managers. Furthermore, we demonstrate the value of quantifying fine-scale relationships between habitat sites via graph-theoretic methods. Multispecies network analysis revealed persistent patterns that can help define region-wide practices, as well as species-specific connectivity that merits more individual consideration. We demonstrate that connectivity is influenced not only by PLD, but also by other life-history traits such as spawning season, moon-phase spawning, and ontogenetic changes in larval depth. High local retention of larvae with a short- or medium-length PLD is consistent with population genetic studies of the area. We also identify regions of management importance, including West Kalaupapa Peninsula, which acts as a consistent larval source across species; East Kalaupapa Peninsula, which is a strongly connected region in every species network; and Pauwalu Harbor/Cape Halawa, which may act as important multigenerational pathways. Connectivity is only one piece of the puzzle of MPA effectiveness, which must also account for reproductive population size, long-term persistence, and post-settlement survival (Burgess et al., 2014). That being said, our study provides a quantitative roadmap of potential demographic connectivity, and thus presents an effective tool for estimating current and future patterns of dispersal around Kalaupapa Peninsula and around Moloka‘i as a whole.

Supplemental Information Current patterns in the model domain.

Current direction and velocity is displayed at a depth of 55 m below sea surface on (A) March 31st, 2011, (B) June 30th, 2011, (C) September 30th, 2011, and (D) December 31st, 2011. Arrowhead direction follows current direction, and u/v velocity is displayed through arrow length and color (purple, low velocity, red, high velocity). Domain extends from 198.2°E to 206°E and from 17°N to 22.2°N. The island of Moloka‘i is highlighted in red.

Subset of validation drifter paths.

Drifter paths are shown in black and corresponding model paths are colored by drifter ID. All drifter information was extracted from the GDP Drifter Data Assembly Center (Elipot et al., 2016). Drifters were included if they fell within the model domain spatially and temporally, and were tested by releasing 1,000 particles at the location and on the day they entered the model domain, at the uppermost depth layer of our oceanographic model (0–5 m).

Selected larval depth distributions.

Modeled vertical larval distributions for Caranx spp. (left), S. rubroviolaceus and C. perspicillatus (middle), and P. porphyreus (right), using data from the 1996 NOAA ichthyoplankton vertical distributions data report (Boehlert & Mundy 1996).

Coast-specific rearward settlement patterns by island.

Proportion of simulated larvae settled to each island from sites on each coast of Moloka‘i, averaged across all species that successfully settled to that island.

Regional cut-nodes for four species networks.

Mokio Point and La‘au Point were cut-nodes for Cellana spp., Mokio Point was a cut-node for C. perspicillatus and S. rubroviolaceus, Pauwalu Harbor and Pala‘au were cut-nodes for C. ignoblis, and Leinaopapio Point was a cut-node for Panulirus spp.

Selected dispersal pathways for Panulirus spp. larvae.

500 randomly sampled dispersal pathways for lobster larvae (Panulirus spp.) that successfully settled to Hawai‘i Island after being spawned off the coast of Moloka‘i. Red tracks indicate settlement earlier in the year (February–March), while black tracks indicate settlement later in the year (April–May). Most larvae are transported to the northeast coast of Hawai‘i via a gyre to the north of Maui, while a smaller proportion are transported through Maui Nui.

Eddy differences by depth layer.

Differences in eddy pattern and strength in surface layers (A, 2.5 m) vs. deep layers (B, 55 m) on March 31, 2011. Arrowhead direction follows current direction, and u/v velocity is displayed through arrow length and color (purple, low velocity, red, high velocity). While large gyres remain consistent at different depths, smaller features vary along this gradient. For example, the currents around Kaho‘olawe, the small gyre off the eastern coast of O‘ahu, and currents to the north of Maui all vary in direction and/or velocity.


Avoid Bothersome Garbage Collection Pauses

Many engineers complain that the non-deterministic behavior of the garbage collector prevents them from utilizing the Java environment for mission-critical applications, especially distributed message-driven display systems (GUIs) where user responsiveness is critical. We agree that garbage collection does occur at the worst times: for example, when a user clicks a mouse or a new message enters the system requiring immediate processing. These events must be handled without the delay of in-progress garbage collection. How do we prevent these garbage collection pauses that interfere with the responsiveness of an application ("bothersome pauses")?

We have discovered a very effective technique to prevent bothersome garbage collection pauses and build responsive Java applications. This technique or pattern is especially effective for a distributed message-driven display system with soft real-time constraints. This article details this pattern in three simple steps and provides evidence of the effectiveness of the technique.

Pattern to Control Garbage Collection Pauses

The Java environment provides so many benefits to the software community - platform independence, industry momentum, a plethora of resources (online tutorials, code, interest groups, etc.), object-oriented utilities and interfaces (collections, network I/O, Swing display, etc.) that can be plugged in and out - that once you have experienced working with Java it's hard to go back to traditional languages. Unfortunately, in some mission-critical applications, like message-driven GUIs that must be very responsive to user events, the requirements force you to take that step backward. There's no room for multiple-second garbage collection pauses. (The garbage collector collects all the "unreachable" references in an application so the space consumed by them can be reused. It's a low-priority thread that usually only takes priority over other threads when the VM is running out of memory.) Do we really have to lose all the benefits of Java? First, let's consider the requirements.

A system engineer should consider imposing requirements for garbage collection like the following list taken from a telecom industry example (see References):

1. GC sequential overhead on a system may not be more than 10% to ensure scalability and optimal use of system resources for maximum throughput.
2. Any single GC pause during the entire application run may be no more than 200 ms to meet the latency requirements as set by the protocol between the client and the server, and to ensure good response times by the server.

Armed with these requirements, the system engineer has defined the worst-case behavior in a manner that can be tested.

The next question is: How do we meet these requirements? Alka Gupta and Michael Doyle make excellent suggestions in their article (see References). Their approach is to tune the parameters on the Java Virtual Machine (JVM). We take a slightly different approach that leaves the use of parameter definitions as defined by the JVM to be used as a final tuning technique.

Why not tell the garbage collector what and when to collect?

In other words, control garbage collection via the software architecture. Make the job of the garbage collector easy! This technique can be described as a multiple step pattern. The first step of the pattern is described below as "Nullify Objects." The second step involves forcing garbage collection to occur as delineated in "Forcing Garbage Collection." The final step involves either placing persistent data out of the reach of the collector or into a data pool so that an application will continue to perform well in the long run.

Step 1: Nullify Objects

Memory leaks strike fear into the hearts of programmers! Not only do they degrade performance, they eventually terminate the application. Yet memory leaks prove very subtle and difficult to debug. The JVM performs garbage collection in the background, freeing the coder from such details, but traps still exist. The biggest danger is placing an object into a collection and forgetting to remove it. The memory used by that object will never be reclaimed.

A programmer can prevent this type of memory leak by setting the object reference and all underlying object references ("deep" objects) to null when the object is no longer needed. Setting an object reference to "null" tells the garbage collector that at least this one reference to the object is no longer needed. Once all references to an object are cleared, the garbage collector is free to reclaim that space. Giving the collector such "hints" makes its job easier and faster. Moreover, a smaller memory footprint also makes an application run faster.

Knowing when to set an object reference to null requires a complete understanding of the problem space. For instance, if the remote receiver allocates the memory space for a message, the rest of the application must know when to release the space back for reuse. Study the domain. Once an object or "subobject" is no longer needed, tell the garbage collector.

Thus, the first step of the pattern is to set objects to null once you're sure they're no longer needed. We call this step "nullify" and include it in the definition of the classes of frequently used objects.

The following code snippet shows a method that "nullifies" a track object. The class members that consist of primitives only (contain no additional class objects) are set to null directly, as in lines 3-5. The class members that contain class objects provide their own nullify method as in line 9.

1   public void nullify () {
2
3       this.threatId = null;
4       this.elPosition = null;
5       this.kinematics = null;
6
7       if (this.iff != null)
8       {
9           this.iff.nullify();
10          this.iff = null;
11      }
12  }

The track nullify is called from the thread that has completed processing the message. In other words, once the message has been stored or processed, that thread tells the JVM it no longer needs that object. Also, if the object was placed in some Collection (like an ArrayList), it's removed from the Collection and set to null.

By setting objects to null in this manner, the garbage collector and thus the JVM can run more efficiently. Train yourself to program with "nullify" methods and their invocation in mind.
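To make the practice above concrete, here is a minimal sketch of the full sequence: remove the object from its collection, call its nullify method, then clear the local reference. The Track class here is a stand-in for the article's track object, with only one member shown.

```java
import java.util.ArrayList;
import java.util.List;

public class NullifyUsage {
    // Hypothetical message object with a nullify() in the style described above.
    static class Track {
        Object threatId = new Object();

        void nullify() {
            this.threatId = null;  // clear the "deep" reference
        }
    }

    public static void main(String[] args) {
        List<Track> activeTracks = new ArrayList<>();
        Track t = new Track();
        activeTracks.add(t);

        // ... message processed by the application ...

        activeTracks.remove(t);  // drop the collection's reference first
        t.nullify();             // then clear the object's internal references
        t = null;                // finally clear the local reference
        // With no reachable references left, the collector may reclaim the space.
        System.out.println("live tracks: " + activeTracks.size());
    }
}
```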

Step 2: "Force" Garbage Collection

The second step of the pattern is to control when garbage collection occurs. The garbage collector, GC, runs as Java priority 1 (the lowest priority). The virtual machine, VM, runs at Java priority 10 (the highest priority). Most books recommend against the usage of Java priority 1 and 10 for assigning priorities to Java applications. In most cases, the GC runs during idle times, generally when the VM is waiting for user input or when the VM has run out of memory. In the latter case, the GC interrupts high-priority processing in the application.

Some programmers like to use the "-Xincgc" directive on the Java command line. This tells the JVM to perform garbage collection in increments when it desires. Again, the timing of the garbage collection may be inopportune. Instead, we suggest that the garbage collector perform a full garbage collection as soon as it can in either or both of two ways:

1. Request garbage collection to happen as soon as possible: This method proves useful when the programmer knows he or she has a "break" to garbage collect. For example, after a large image is loaded into memory and scaled, the memory footprint is large. Forcing a garbage collection to occur at that point is wise. Another good area may be after a large message has been processed in the application and is no longer needed.
2. Schedule garbage collection to occur at a fixed rate: This method is optimal when the programmer does not have a specific moment when he knows his application can stop shortly and garbage collect. Normally, most applications are written in this manner.

Listing 1 introduces a class named "BetterControlOfGC". It's a utility class that provides the methods described earlier. There are two public methods: "suggestGCNow()" and "scheduleRegularGC(milliseconds)" that respectively correspond to the steps described earlier. Line 7 suggests to the VM to garbage collect the unreachable objects as soon as possible. The documentation makes it clear that the garbage collection may not occur instantaneously, but experience has shown that it will be performed as soon as the VM is able to accomplish the task. Invoking the method on line 25 causes garbage collection to occur at a fixed rate as determined by the parameter to the method.

In scheduling the GC to occur at a fixed rate, a garbage collection stimulator task, GCStimulatorTask, is utilized. The code extends the "java.util.Timer" thread in line 10. No new thread is created; the processing runs on the single timer thread available beginning with the Java 1.3 environment. Similarly, to keep the processing lean, the GC stimulator follows the Singleton pattern as shown by lines 18-23 and line 27. There can be only one stimulator per application, where an application is any code running on an instance of the JVM.
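Listing 1 itself is not reproduced in this excerpt. A minimal reconstruction consistent with the description above (two public methods, a single daemon timer thread, and a singleton stimulator) might look like the following; field and variable names are assumptions, not the original code.

```java
import java.util.Timer;
import java.util.TimerTask;

// Sketch of the "BetterControlOfGC" utility described in the text.
public class BetterControlOfGC {

    // Singleton: at most one stimulator timer per JVM instance.
    private static Timer stimulator;

    // Suggest a full collection as soon as the VM is able to perform it.
    public static void suggestGCNow() {
        System.gc();
    }

    // Schedule a collection suggestion at a fixed rate (in milliseconds).
    public static synchronized void scheduleRegularGC(long milliseconds) {
        if (stimulator == null) {
            stimulator = new Timer(true);  // daemon thread: won't block JVM exit
            stimulator.scheduleAtFixedRate(new TimerTask() {
                public void run() {
                    suggestGCNow();
                }
            }, milliseconds, milliseconds);
        }
    }

    public static void main(String[] args) {
        scheduleRegularGC(60_000L);  // suggest a collection once a minute
        suggestGCNow();              // and one right now
    }
}
```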

We suggest that you set the interval at which the garbage collector runs from a Java property file. Thus you can tune the application without having to recompile the code. Write some simple code to read a property file that's either a parameter on the command line or a resource bundle in the class path. Place the command parameter "-verbose:gc" on your executable command line and measure the time it takes to garbage collect. Tune this number until you achieve the results you want. If the budget allows, experiment with other virtual machines and/or hardware.
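The property-reading code is not shown in the article. A minimal sketch of the idea, assuming a hypothetical property name `gc.interval.millis` and a hypothetical `-Dgc.props=...` command-line parameter naming the file, might look like:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Sketch: load the GC interval from a property file named on the
// command line (java -Dgc.props=gc.properties ...). The file name,
// property name, and default are assumptions, not from the article.
public final class GCTuning {

    private static final long DEFAULT_INTERVAL_MS = 500L;

    public static long loadGCIntervalMillis() {
        String path = System.getProperty("gc.props", "gc.properties");
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        } catch (IOException e) {
            return DEFAULT_INTERVAL_MS;   // fall back if no file is present
        }
        return Long.parseLong(
                props.getProperty("gc.interval.millis",
                                  Long.toString(DEFAULT_INTERVAL_MS)));
    }
}
```

Because the interval lives in a file rather than in code, you can adjust it between runs while watching the "-verbose:gc" output, with no recompilation.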

Step 3: Store Persistent Objects into Persistent Data Areas or Store Long-Lived Objects in Pools

Using persistent data areas is purely optional. It supports the underlying premise of this article: in order to bound the disruption of the garbage collector in your application, make its job easy. If you know that an object or collection of objects will live for the duration of your application, let the collector know. It would be nice if the Java environment provided some sort of flag that could be placed on objects upon their creation to tell the garbage collector "keep out". However, there is currently no such means. (The Real-Time Specification for Java describes an area of memory called "Immortal Memory" where objects live for the duration of the application and garbage collection should not run.) You may try using a database; however, this may slow down your application even more. Another solution currently under the Java Community Process is JSR 107. JCache provides a standard set of APIs and semantics that allow a programmer to cache frequently used data objects for the local JVM or across JVMs. This API is still under review and may not be available yet. However, we believe it holds much promise for the Java developer community; keep this avenue open and in mind for future architectures. What can we do now?

The pooling of objects is not new to real-time programmers. The concept is to create all your expected data objects before you begin processing, then all your data can be placed into structures without the expense of instance creation during processing time. This has the advantage of keeping your memory footprint stable. It has the disadvantage of requiring a "deep copy" method to be written to store the data into the pool. (If you simply set an object to another, you're changing the object reference and not reusing the same space.) The nanosecond expense of the deep copy is far less than that of the object instance creation.

If the data pooling technique is combined with the proper use of the "nullify" technique, garbage collection becomes optimized. The reasons are fairly straightforward:

1.  Since the object is set to null immediately after the deep copy, it lives only in the young generation portion of the memory. It does not progress into the older generations of memory and thus takes less of the garbage collector's cycle time.
2.  Since the object is nullified immediately and no other reference to it exists in some other collection object in the application, the job of the garbage collector is easier. In other words, the garbage collector does not have to keep track of an object that exists in a collection.
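A minimal sketch of the pool-plus-deep-copy idea follows. The `Track` class and its fields are hypothetical stand-ins for illustration, not the article's actual data model:

```java
// Sketch of the pooling technique: pre-allocate slots up front, then
// deep-copy incoming data into them instead of keeping new instances.
public class TrackPool {

    public static class Track {
        double latitude;
        double longitude;
        long   timeMillis;

        // Deep copy: overwrite this pooled slot's fields rather than
        // replacing the reference, so the pooled storage is reused.
        void copyFrom(Track source) {
            this.latitude   = source.latitude;
            this.longitude  = source.longitude;
            this.timeMillis = source.timeMillis;
        }
    }

    private final Track[] slots;

    public TrackPool(int capacity) {
        slots = new Track[capacity];
        for (int i = 0; i < capacity; i++) {
            slots[i] = new Track();   // create all objects before processing
        }
    }

    public Track get(int index) {
        return slots[index];
    }

    // Copy an incoming track into the pool; the caller should then
    // nullify its own reference to the incoming object.
    public void store(int index, Track incoming) {
        slots[index].copyFrom(incoming);
    }
}
```

A caller would do `pool.store(i, msg); msg = null;` immediately after receipt, so the short-lived message object dies in the young generation while the pooled storage keeps the memory footprint stable.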

When using data pools, it's wise to use the parameters "-XX:+UseConcMarkSweepGC -XX:MaxTenuringThreshold=0 -XX:SurvivorRatio=128" on the command line. These tell the JVM to move objects on the first sweep from the new generation to the old, and to use the concurrent mark-sweep algorithm on the old generation, which proves more efficient since it works "concurrently" on a multi-processor platform. For single-processor machines, try the "-Xincgc" option. We've seen those long garbage collector pauses, which occur after hours of execution, disappear using this technique and these parameters. Performing well in the long run is the true benefit of this last step.

Performance Results

Typically, most engineers want proof before changing their approach to designing and coding. Why not? Since we're now suggesting that even Java programmers should be concerned about resource allocation, it had better be worth it! Once upon a time, assembly language and C programmers spent time tweaking memory and register usage to improve performance. This step was necessary. Now, as higher-level object-oriented programmers, we may disdain this thought. This pattern has dared to imply that such considerations, although not as low level as registers and memory addresses (instead at the object level), are still necessary for high-performance coding. Can it be true?

The underlying premise is that if you know how your engine works, you can drive it better to obtain optimal performance and endurance. This is as true for my 1985 300TD (Mercedes, five cylinder, turbo diesel station wagon) with 265,000 miles as for my Java code running on a HotSpot VM. For instance, knowing that a diesel's optimal performance is when the engine is warm since it relies on compression for power, I let my car warm up before I "push it." Similarly, I don't overload the vehicle with the tons of stuff I could place in the tailgate. HotSpot fits the analogy. Performance improves after the VM "warms up" and compiles the HotSpot code into the native language. I also keep my memory footprint lean and light. The comparison breaks down after awhile, but the basic truth does not change. You can use a system the best when you understand how it works.

Our challenge to you is to take statistics before and after implementing this pattern on just a small portion of your code. Please recognize that the gain will be best exemplified when your application is scaled upward. In other words, the heavier the load on the system, the better the results.

The following statistics were taken after the pattern was applied. They are charted as:

1.  Limited nullify method invocation is used, where only the incoming messages are not "nullified." (The remainder of the application from which the statistics were taken was left intact with a very lean memory usage.) There is no forced garbage collection.
2.  Nullify method invocation and forced garbage collection are utilized.

The test environment is a Microsoft Windows 2000 X86 Family 15 Model 2 Stepping 4 Genuine Intel ~1794MHz laptop running the BEA WebLogic Server 7.0 with Service Pack 7.1 with a physical memory size of 523,704KB. The Java Message Server (JMS server), a track generator, and a tactical display are all running on the same laptop over the local developer network (MAGIC). The server makes no optimizations, even though each application resides locally. The JVMs are treated as if they were distributed across the network. They're running on the J2SE 1.4.1 release.

The test target application is a Java Swing Tactical Display with full panning, zooming, and track-hooking capabilities. It receives bundles of tracks via the Java Message Service that are displayed at their proper location on the given image. Each track is approximately 88 bytes and the overall container size is about 70 bytes. This byte measurement does not include all the additional class information that's also sent during serialization. The container is the message that holds an array of tracks that contains information such as time and number of tracks. For our tests, the tracks are sent at a 1Hz rate. Twenty sets of data are captured.

To illustrate the test environment, a screen capture of a 5,000 track load (4,999 tracks plus the ship) is shown in Figure 1. The background shows tracks rendered with the Military Standard 2525B symbology over an image of the Middle East. The small window titled "Track Generator Desktop" is a minimized window showing the parameters of the test set through the track generator application. Notice that 45 messages had been sent at the time of the screen capture. Directly beneath this window sits the Windows Task Manager. Note that the CPU utilization is at 83%. At first this doesn't seem that bad. But at that rate, there isn't much room for the user to begin zooming, panning, hooking tracks, and so on. The final command window to the right is that of the tactical display application. The parameter "-verbose:gc" is placed on the Java command line (java -verbose:gc myMainApplication.class). The VM is performing the listed garbage collection at its own rate, not by command of the application.

The final test of 10,000 tracks performed extremely poorly. The system does not scale; the CPU is pegged. At this point most engineers may jeer at Java again. Let's take another look after implementing the pattern.

After implementation, where the nullify methods are invoked properly and garbage collection is requested at a periodic interval (2Hz), dramatic improvements are realized. The last test of 10,000 tracks proves that the processor still has plenty of room to do more work. In other words, the pattern scales very well.

Performance Summary

The pattern to help control garbage collection pauses most definitely improves the overall performance of the application. Notice how well the pattern scales under the heavier track loads in the performance bar chart in Figure 2. The darker middle bar shows the processor utilization at each level of the message (track) load. As the message traffic increases, the processor utilization grows more slowly than without the pattern. The last light-colored bar shows the improved performance. The main strength of the pattern is how well it scales under heavy message loads.

There is another subtle strength to the pattern. This one is difficult to measure since it requires very long-lived tests. If Step 3 is faithfully followed, those horribly long garbage collection pauses that occur after hours of running disappear. This is a key benefit to the pattern since most of our applications are designed to run "forever."

We're confident that many other Java applications would benefit from implementing this very simple pattern.

The steps to control garbage collection pauses are:

1.  Set all objects that are no longer in use to null and make sure they're not left within some collection. "Nullify" objects.
2.  Force garbage collection to occur both:
  • After some major memory-intense operation (e.g., scaling an image)
  • At a periodic rate that provides the best performance for your application
3.  Save long-lived data in a persistent data area if feasible, or in a pool of data, and use the appropriate garbage collector algorithm.

    By following these three simple steps, you'll avoid those bothersome garbage collection pauses and enjoy all the benefits of the Java environment. It's time the Java environment was fully utilized in mission-critical display systems.

    References

  • Gupta, A., and Doyle, M. "Turbo-Charging the Java HotSpot Virtual Machine, v1.4.x to Improve the Performance and Scalability of Application Servers": http://developer.java.sun.com/developer/technicalArticles/Programming/turbo/
  • JSR 1, Real-Time Specification for Java: http://jcp.org/en/jsr/detail?id=1
  • Java HotSpot VM options: http://java.sun.com/docs/hotspot/VMOptions.html
  • Java Specification Request for JCache: http://jcp.org/en/jsr/detail?id=107

  • PCI DSS questions answered: Solutions to tough PCI problems

    During our recent virtual seminar, PCI DSS 2.0: Why the latest update matters to you, experts Ed Moyle and Diana Kelley of SecurityCurve were unable to answer all of the PCI DSS questions they received during their live question-and-answer session. SearchSecurity.com has asked them to give brief responses to each of the unanswered questions, and we've published those questions and responses below to help you solve your unique PCI problems.

    For additional information about the Payment Card Industry Data Security Standard, visit SearchSecurity.com's PCI DSS resources page.

  • Where can we find information about PCI DSS compliance that is focused on those of us who are "Mom & Pop" shops?

    Since most small organizations fall into the self-assessment category, a great resource is the Security Standards Council SAQ (Self-Assessment Questionnaire) section. Specifically, these documents:

    SAQ main page

    PCI DSS SAQ instructions and guidelines

    SAQ: How it all fits together

    SAQ A-D and Guidelines

  • It seems the necessity of PCI compliance hasn't fully penetrated the Asian markets. Do you have any suggestions on how to achieve compliance for companies who do business in Asia, where adjusting to PCI standards isn't a priority?

    Companies should be compliant regardless of where the payment information is stored, processed or transmitted. Even if processors in a particular locale aren't as focused on the standard, the companies (merchants/retailers) with operations in those locales should implement the same controls as they do in other areas of the globe.

  • If card data is entered via the virtual terminal of a third party on a desktop PC where wireless is not enabled, do I need wireless scans?

    All wireless networks within the CDE (cardholder data environment) need to be scanned pursuant to the PCI DSS wireless guidelines provided by the Council. If audit and test findings confirm there is no wireless on the virtual terminal and there is no wireless within the CDE, additional scans are not required (for example, note that the wireless scanning requirement is not addressed in SAQ C-VT, which is specific to virtual terminal-only environments). Note, however, that if you use other devices beyond just the virtual terminal to store/process/transmit cardholder data (such as a PoS on your network), you will have to scan.

  • Is there a standard for isolating non-compliant custom systems that do not have a newer PCI-compliant version available? Let's assume this would be a software package without encryption in its database.

    There are two standards for payment software – the PA-DSS for commercial software and the PCI DSS for commercial software with significant customization and custom software. If the custom software is saving PANs in an unencrypted format, it is non-compliant with PCI DSS. The best options are to stop saving the PANs and use an alternative -- like masking, tokens or other unique identifier -- or find a way to encrypt the PAN data before it enters the database. If this is not possible, create a document explaining why, list compensating controls (such as increased monitoring and access control) and put in place a road map for mitigating or eliminating the problem. Although the compensating controls/road map will not mean a fully compliant RoC or SAQ, it does show good faith on the part of the company to work towards correcting the problem.

  • In terms of a policy strategy, should an enterprise's existing information security policies be amended to include PCI requirements, or do the requirements need to be addressed in PCI-specific policies?

    In most cases the CDE (cardholder data environment) under PCI is a very small portion of the network and should be clearly zoned off from the rest of the corporate network activities. As a separate part of the network, a unique policy (or policy set) should apply for that zone. So PCI-specific policies should exist. However, parts of existing policy – for example strong password controls and reset – can be re-used in the PCI-specific policies where applicable.

  • Regarding encryption in requirement 3, if the decryption key is not present in the cardholder environment, is the system out of the scope of PCI?

    In the FAQ section of the Council site it states: "Encrypted data may be deemed out of scope if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it." So if the entity does not have the key, that data may be deemed out of scope.

  • Does PCI require verification that there are no rogue wireless access points that may have connected to the POS network?

    Yes. From the Council's Wireless Guidance: "These are requirements that all organizations should have in place to protect their networks from attacks via rogue or unknown wireless access points (APs) and clients. They apply to organizations regardless of their use of wireless technology and regardless of whether the wireless technology is a part of the CDE or not." And, "The purpose of PCI DSS requirement 11.1 is to ensure an unauthorized or rogue wireless device introduced into an organization's network does not allow unmanaged and unsecured WLAN access to the CDE. The intent is to prevent an attacker from using rogue wireless devices to negatively impact the security of cardholder data. In order to combat rogue WLANs, it is acceptable to use a wireless analyzer or a preventative control such as a Wireless Intrusion Detection/Prevention System (IDS/IPS) as defined by the PCI DSS."

  • Where is disaster recovery and business continuity planning covered in the PCI DSS requirements, or is it?

    Disaster recovery and BCP are not explicitly called out in the 2.0 version of PCI DSS; however, incident response planning is. "12.5.3 - Establish, document, and distribute security incident response and escalation procedures to ensure timely and effective handling of all situations." Also in the Penetration Testing supplement it states: "Perform testing in accordance with critical company processes including change control, business continuity, and disaster recovery." And, in the Application Reviews and Web Application Firewalls Clarified it states: "Adhere to all policies and procedures including change control, business continuity, and disaster recovery."

  • Would you define "scope" as the geographical area of the PCI servers? Or would you define "scope" as the SAQ requirements? It seems at times they are used interchangeably.

    The scope of the audit surface is the cardholder data environment (CDE). The CDE is "The people, processes and technology that store, process or transmit cardholder data or sensitive authentication data, including any connected system components." So any system component in the CDE is in scope regardless of geographic location.

  • Shared accounts are prohibited according to PCI DSS as I understand it, but imagine if you have your network equipment management outsourced and the firewalls and switches for the cardholder environment are managed by a third party or a service supplier. In this scenario, you would need two-factor authentication for administrative access to the CHE, but what if the service provider/supplier has several technicians and you are using RSA tokens? Do you have to supply one authentication account and one RSA token per technician? Or is it necessary only to supply one account and one RSA token for the service provider/supplier?

    You're right that shared accounts are prohibited by PCI DSS; Requirement 8 states: "Assign a unique ID to each person with computer access." Strictly speaking, to be compliant, a unique ID and two-factor token would need to be assigned for each person remotely administering the firewalls and switches.

  • Can you speak to some of the feedback you have received from clients who have implemented a tokenization product, including some of the key areas to focus on when selecting a vendor?

    We've received positive feedback from companies that use tokenization in the CDE to reduce scope. One that we spoke to and have mentioned publicly is Helzberg Diamond Shops, Inc. However, we caution that to be completely effective, organizations need to also address scope reduction and zoning, document the tokenization implementation so it can be reviewed during audit, and confirm with your acquirer/processor that tokenization is acceptable. For vendor selection, the Council is working on tokenization guidance, but Visa Inc. has already issued its recommended guidance, Tokenization Best Practices.
  • Speaking from a university standpoint, we take credit cards in many ways -- POS, Internet, MOTO – but we use only PA-DSS applications and we are hosted by a service provider, so we do not store any CHD. Our CHDE is really the PCs (and network) where the card data is entered or swiped. We have segmented all system components (PCs where CHD is entered or swiped) away from our regular network. It appears that many of the PA-DSS requirements are in reference to "stored" credit card data. Can you give me some advice on how to determine how much of the requirements apply to us given that we do not store CHD? We have secured all components that have CHD entered and we are running PA-DSS-compliant applications.

    Sounds like you've done a lot of great scoping work. The PA-DSS applies to applications, but entities still need to be PCI DSS compliant. Since your applications are already PA-DSS compliant, focus instead on what matters to your university, which is attesting to PCI DSS compliance. If your transaction levels qualify you for self-assessment, review the self-assessment guidelines (please see question 1 for more information), determine which SAQ applies, and complete it. In general, if you fall under multiple SAQs your acquirer/processor will want you to complete SAQ D. However, to be sure, check with your acquirer/processor to confirm.
  • Can you offer advice on what to look for in an internal audit and reporting product for PCI DSS compliance?

    There are multiple audit and reporting tool types that can be used in PCI DSS compliance. For example, a penetration testing system will return reports on vulnerabilities and exposures in the CDE, while a patching system will return reports on patch information, both of which apply. In many cases, when organizations think about a meta-console for reporting, it is a log or event/information aggregation console that brings together multiple reporting components for use in PCI DSS compliance work. For any tool, look for the ability to check for issues specific to PCI DSS (ex: password policy on servers and applications in the CDE) and report on these in a template that maps the finding to the specific requirement.

  • I have a question about PCI and the cloud. We are a PCI Level 1 merchant. We are thinking of moving our data center to cloud, Amazon to be specific. We understand that Amazon is PCI Level 1 compliant. Is it really possible to be a PCI-compliant Level 1 merchant in a cloud environment? Do you have any guidance regarding PCI in a cloud environment?

    Amazon.com Inc. (Amazon Web Services – AWS) is, as of this writing, a PCI DSS Validated Service Provider. However, using AWS, or any Validated Service Provider, does not eliminate the need for the entity using the service to be PCI DSS compliant. As Amazon notes, "All merchants must manage their own PCI certification. For the portion of the PCI cardholder environment deployed in AWS, your QSA can rely on our validated service provider status, but you will still be required to satisfy all other PCI compliance and testing requirements that don't deal with the technology infrastructure, including how you manage the cardholder environment that you host with AWS." So while a cloud provider can be third party validated as a PCI DSS provider, this doesn't mean they're certified to PCI or that entities using the service are automatically certified.

    If you are going to host some or all of your CDE in the cloud, do so with a compliant provider. However, don't forget to annually check that the provider is remaining compliant with your CDE, as well as the parts of your CDE that are hosted in the cloud. Additionally, according to the PCI Security Standards, your RoC must "document the role of each service provider, clearly identifying which requirements apply to the assessed entity and which apply to the service provider." And:

    "12.8 – If cardholder data is shared with service providers, maintain and implement policies and procedures to manage service providers, to include the following:

    12.8.1 – Maintain a list of service providers.

    12.8.2 –Maintain a written agreement that includes an acknowledgement that the service providers are responsible for the security of cardholder data that the service providers possess.

    12.8.3 - Ensure there is an established process for engaging service providers including proper due diligence prior to engagement.

    12.8.4 - Maintain a program to monitor service providers' PCI DSS compliance status at least annually"

  • In an effort to ensure PCI compliance, we have a number of different products from different vendors, since there does not seem to be one full PCI compliance "solution." Is this by design? Is there any advantage to having each requirement met by a different vendor's product?

    There are a number of components in PCI compliance; they encompass people, process and technology, span both the physical and the logical, and include all of the documentation related to policies and process. It would be extremely difficult (arguably impossible) for a single solution to do it all. The reality is that organizations use a number of different vendor solutions for the technical controls.

    Some vendors provide products that meet different controls. For example, a vendor with a log aggregation or SIEM tool that also sells antivirus/malware or patch management. The big win is not necessarily to have all tools (or many tools) from the same vendor, but to be able to bring together reporting, logs, test and monitoring information in a centralized place to make oversight and compliance monitoring more comprehensive and efficient.

  • How can companies deal with call recordings in the call center when taking card payments by phone? Are there any mitigating factors?

    Because there is not a lot of call center guidance in the PCI DSS, the Council addressed call center issues in a special FAQ #5362. "The Council's position remains that if you can digitally query sensitive authentication data (SAD) contained within audio recordings - if SAD is easily accessible - then it must not be stored."

    Though this is not hosted on the PCI Security Standard Council Domain -- it is the official FAQ for the Council and can be accessed directly by clicking in the FAQs link at the top of the official Council page.

    Also, please see question below for additional information on storage rules regarding sensitive authentication data (SAD).

  • Our call-recording solution requires manual intervention to bleep out the CV2 number. Is this sufficient as a compensating control to meet the standard?

    If the CV2 (or any other sensitive authentication data/SAD) is not stored, this should meet the standard. Document how the manual process is implemented to ensure SAD is truly being deleted and not stored.

    Alternately, according to PCI Security Standards Council FAQ "If these recordings cannot be data mined, storage of CAV2, CVC2, CVV2 or CID codes after authorization may be permissible as long as appropriate validation has been performed. This includes the physical and logical protections defined in PCI DSS that must still be applied to these call recording formats."

  • If you have backups of credit card data in a secure location, is that a violation? How can it be mitigated?

    It's not a violation -- it is part of a requirement! Requirement 9.5 explicitly states: "Store media back-ups in a secure location, preferably an off-site facility, such as an alternate or back-up site, or a commercial storage facility. Review the location's security at least annually." Remember to make sure the data was encrypted before it was backed up and that the personnel at the facility do not have the key to decrypt the data.

  • What are the rules for external scanning?

    External scanning is covered in Requirement 11.2.2 – "Perform quarterly external vulnerability scans via an Approved Scanning Vendor (ASV), approved by the Payment Card Industry Security Standards Council (PCI SSC).

    Note: Quarterly external vulnerability scans must be performed by an Approved Scanning Vendor (ASV), approved by the Payment Card Industry Security Standards Council (PCI SSC). Scans conducted after network changes may be performed by internal staff." 

    See the PCI Security Standard for a list of ASVs

    Also helpful is the ASV Program Guide, and the ASV Client Feedback Form

  • PCI 2.0 lightly touches upon virtualization for the first time. Does this extend beyond virtual machine images to virtual appliances (e.g. use of virtual firewalls & virtual switches in hosted products)?

    Yes, according to the Scope of Assessment for Compliance it does extend to virtual appliances. "System components" in v2.0 include "any virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors." Also note that virtualization is mentioned in Requirement 2.2.1: Implement only one primary function per server, "Note: Where virtualization technologies are in use, implement only one primary function per virtual system component."

  • Is a system that is not holding the cardholder data, but only processing it (like a Web farm), a part of PCI audit requirements?

    Yes, if a system component stores, processes or transmits cardholder data or sensitive authentication data, it is part of the CDE and within scope of the PCI DSS audit. For additional guidance, refer to the Scope of Assessment for Compliance with PCI DSS requirements section of PCI DSS v2.0.

  • When do companies have to switch over to PCI 2.0?

    For the absolute final word on compliance deadlines, check with your acquirer or specific card brand. In general, however, v2.0 went into effect on January 1, 2011 and there is a year to comply with the new standard. If you are in the middle of an assessment cycle that started in 2010 and the compliance assessment will be completed before the end of 2011, you can continue the process with v1.2.1. If you are starting a new assessment cycle in 2011, use v2.0.

  • If an organization has filled out the self-assessment questionnaire (SAQ) and identified that it has not complied with the 12 DSS requirements, should the SAQ still be submitted? Or should the organization wait until the 12 requirements have been satisfied?

    Before admitting defeat, see if there is any way your organization can become compliant. Don't forget, if a non-compliant system or process is not essential, it could be scoped out of the CDE and out of the compliance surface. Also don't forget about compensating controls. The ideal is to be fully compliant, but compensating controls provide a way for organizations to mitigate risks as they work towards implementing better controls.

    According to the Compensating Controls Appendix B in SAQ D v2.0: "Compensating controls may be considered for most PCI DSS requirements when an entity cannot meet a requirement explicitly as stated, due to legitimate technical or documented business constraints, but has sufficiently mitigated the risk associated with the requirement through implementation of other, or compensating, controls." Also, there is a compensating control worksheet that needs to be completed in Appendix C of the SAQ D v2.0.

    If de-scoping the non-compliant system and compensating controls are not options, then you will need to check the "Non-Compliant" box on the SAQ and put in a target date for compliance. In most cases, your acquirer/processor will want to see this proof, and possibly ask your organization to fill out the "Action Plan" part of the SAQ; however, check with your acquirer/processor to be sure.

  • Let's talk about the mythical beast that is end-to-end encryption. Does it exist? More specifically, one of our audience members asked, "What if end-to-end encryption from the pin pad / card swipe POS is implemented? Does that take everything out of PCI scope?"

    The Council is calling this P2PE, for point-to-point encryption: turning the cardholder data into ciphertext (encrypting it) and then transmitting it, still encrypted, to a destination such as the payment processor. If the P2PE begins at the cashier's swipe of the credit card at the PoS (point of sale) and continues all the way to the processor, the data is not stored, and no one in the interim path has the keys to decrypt it, then it could reduce the scope of the audit surface significantly. Caveats here are that everything will need to be implemented correctly, validated and tested. However, note that the entity still must be PCI DSS compliant – though compliance may be greatly simplified. And, at this time, the PCI Security Standards Council still deems P2PE an emerging technology and is formalizing official guidance, training QSAs on how to evaluate relevant P2PE components, as well as considering creating a validated list of P2PE solutions. For more information on the status of P2PE, please read the Initial Roadmap: Point-to-Point Encryption Technology and PCI DSS Compliance program guide.

  • Under what circumstances can an internal audit certify a merchant as being PCI compliant?

    If the merchant qualifies for SAQ completion, internal audit can be responsible for the assessment and attestation process. "Each payment card brand has defined specific requirements for compliance validation and reporting, such as provisions for performing self-assessments and when to engage a QSA."

    If the merchant must complete a RoC, it is possible to do the on-site assessment with an internal resource if the brand allows it. Check with your brand for specifics; Mastercard, for example, has required that as of June 30, 2011, the "primary internal auditor staff engaged in validating PCI DSS compliance [must] attend PCI SSC ISA Training and pass the associated accreditation program annually."

  • What PCI and security implications do you anticipate arising with the new generation of contactless cards, given that they are now being widely distributed?

    If the data can be transmitted in a secure, encrypted format over the radio-frequency (RF) link from the contactless card to a secure endpoint, the data should not be exposed. However, if the data from the card travels in cleartext over the air, sniffing attacks will be a major concern. Key management and man-in-the-middle (MitM) attacks may also be problems, depending on the specific technical implementation.

  • Are quarterly penetration tests still required for wireless access points that are using WPA2?

    Yes, quarterly tests are required. Requirement 11.1 covers all known and unknown wireless access points regardless of the protections on them: "11.1 - Test for the presence of wireless access points and detect unauthorized wireless access points on a quarterly basis." One of the intents of this requirement is to ensure there are no rogue devices in the CDE.

  • Does Citrix sessioning between payment apps and hosted sites provide sufficient encryption for PCI compliance?

    If the session is configured to transmit the data between the payment apps and the hosted site using an approved method (e.g., SSL/TLS), then it should be compliant for at least the transmission portion of the standard.

    Requirement 4.1 -- "Use strong cryptography and security protocols (for example, SSL/TLS, IPSEC, SSH, etc.) to safeguard sensitive cardholder data during transmission over open, public networks."
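    As one concrete example of the transmission-side configuration Requirement 4.1 asks for, Python's standard `ssl` module can build a client context that validates certificates, checks hostnames, and refuses legacy SSL and early TLS (which later PCI DSS revisions explicitly disallow). This is a minimal sketch, not a full compliance recipe:

```python
# Minimal sketch of a TLS client context in the spirit of Requirement 4.1:
# certificate validation on, hostname checking on, legacy protocols refused.
import ssl

context = ssl.create_default_context()            # CERT_REQUIRED + hostname checks
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 and TLS 1.0/1.1

# A caller would then wrap its socket before sending cardholder data, e.g.:
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           tls.sendall(payload)
```

    The key design point is to start from `create_default_context()`, which enables verification by default, rather than building a bare `SSLContext` and opting in to each protection by hand.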

  • How much are organizations spending on PCI compliance? Can you provide a range both for one-time costs and annual maintenance?

    There are two sides to this coin: the cost of the audit and the overall cost of compliance.
  • Audit cost: According to a recent Ponemon survey on PCI DSS trends (.pdf), the average cost of the audit itself is $225,000 for the largest (Tier 1) merchants, but the cost can range much higher or lower depending on the complexity of the environment, the size of the CDE, and other factors.

  • Overall cost of compliance: In 2008, Gartner surveyed 50 merchants and found that PCI costs had been increasing since 2006 (Gartner.com registration required), citing average costs of $2.7 million for Tier 1 merchants, $1.1 million for Tier 2, and $155,000 for Tier 3. Again, these are averages, so your particular case might differ.
  • Requirement 2.2.1 mandates that critical servers provide a single-purpose service. If I have a single server hosting an e-commerce application with a Web server and database residing on one physical server, do I need to place the database on a separate server?

    Yes, in most cases. Requirement 2.2.1 -- "Implement only one primary function per server to prevent functions that require different security levels from co-existing on the same server." The intent is to limit exposure if the underlying host -- in this case, the operating system running both the database and the e-commerce application -- is breached, which could expose one or both services to attack. Virtual machines are now allowed, so the same piece of hardware could be used with a hypervisor to separate the two services across two VMs. Alternatively, if there is a critical business need, such as performance, for both primary functions to be on the same server, consider whether this justifies a compensating control by completing the compensating control worksheet (Appendix C of the PCI DSS).
  • About the authors:

    Ed Moyle is currently a manager with CTG's Information Security Solutions practice, providing strategy, consulting and solutions to clients worldwide, as well as a founding partner of SecurityCurve.

    Diana Kelley is a partner with Amherst, N.H.-based consulting firm SecurityCurve. She formerly served as vice president and service director with research firm Burton Group. She has extensive experience creating secure network architectures and business solutions for large corporations and delivering strategic, competitive knowledge to security software vendors.


