Posted to solr-user@lucene.apache.org by Benson Margulies <bi...@gmail.com> on 2013/05/29 03:00:08 UTC

Not so concurrent concurrency

I can't quite apply SolrMeter to my problem, so I wrote something of my
own. The brains of the operation are in the function below.

This feeds a ConcurrentUpdateSolrServer about 95 documents, each about
10 MB, with 'threads' set to six. Yet Solr barely uses more than one
core.

    private long doIteration(File[] filesToRead) throws IOException, SolrServerException {
        // queueSize = 1000, threadCount = 'threads' (six in the runs described above)
        ConcurrentUpdateSolrServer concurrentServer = new ConcurrentUpdateSolrServer(
                launcher.getSolrServer().getBaseURL(), 1000, threads);
        UpdateRequest updateRequest = new UpdateRequest(updateUrl);
        updateRequest.setCommitWithin(1);
        Stopwatch stopwatch = new Stopwatch();

        List<File> allFiles = Arrays.asList(filesToRead);
        Iterator<File> fileIterator = allFiles.iterator();
        while (fileIterator.hasNext()) {
            // Accumulate files into a batch until BATCH_LIMIT bytes is reached.
            List<File> thisBatch = Lists.newArrayList();
            int batchByteCount = 0;
            while (batchByteCount < BATCH_LIMIT && fileIterator.hasNext()) {
                File thisFile = fileIterator.next();
                thisBatch.add(thisFile);
                batchByteCount += thisFile.length();
            }
            LOG.info(String.format("update %d files", thisBatch.size()));
            // Stream this batch through the concurrent server as one request.
            updateRequest.setDocIterator(new StreamingDocumentIterator(thisBatch));
            stopwatch.start();
            concurrentServer.request(updateRequest);
            concurrentServer.blockUntilFinished();
            stopwatch.stop();
        }
        // Total time accumulated across all batches.
        return stopwatch.elapsedMillis();
    }
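
For reference, StreamingDocumentIterator is not shown in the post; it is
the piece that turns each File in a batch into a SolrInputDocument on
demand. Here is a minimal sketch of what such an iterator might look
like, with the caveat that the "id" and "content" field names and the
one-document-per-file layout are my assumptions, not details from the
real class:

    // Hypothetical sketch only; the real StreamingDocumentIterator is not
    // shown in this post. Assumes each File holds the body of one document.
    import java.io.File;
    import java.io.IOException;
    import java.util.Iterator;
    import java.util.List;

    import org.apache.solr.common.SolrInputDocument;

    import com.google.common.base.Charsets;
    import com.google.common.io.Files;

    class StreamingDocumentIterator implements Iterator<SolrInputDocument> {
        private final Iterator<File> files;

        StreamingDocumentIterator(List<File> batch) {
            this.files = batch.iterator();
        }

        public boolean hasNext() {
            return files.hasNext();
        }

        public SolrInputDocument next() {
            // One file is read per call, so only the current ~10 MB document
            // sits in memory while the request streams.
            File file = files.next();
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", file.getName());  // assumed field name
            try {
                doc.addField("content", Files.toString(file, Charsets.UTF_8));  // assumed field name
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            return doc;
        }

        public void remove() {
            throw new UnsupportedOperationException();
        }
    }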
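
One thing worth checking is whether the single streamed request is the
serialization point: each UpdateRequest handed to request() is one unit
of work, so a lone request carrying a document iterator may be drained
by a single internal runner no matter how many threads are configured.
A quick experiment (a sketch only, reusing the hypothetical iterator
above) is to enqueue documents one at a time, which gives the server one
queue entry per document:

    // Experimental variant: one add() per document, so the internal runner
    // threads have independent queue entries to pick up. Whether this
    // spreads the load across cores is exactly the open question here.
    Iterator<SolrInputDocument> docs = new StreamingDocumentIterator(thisBatch);
    while (docs.hasNext()) {
        // commitWithin moves to the per-add overload (milliseconds).
        concurrentServer.add(docs.next(), 1);
    }
    concurrentServer.blockUntilFinished();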