Previously you responded to a question concerning sorting data by project
number. Can this be done for a title, i.e. XXX-contract1, XXX-contract2,
YYY-Contract1, or some variation of this?


Subject: Re: Vlookup to Return a Range of Data 5/29/2006 5:57 PM PST

By: Biff In: microsoft.public.excel.misc


Ok......

Here's how to do this using formulas. You can play around with it and see if
it'll fit your needs. If not, see Dave's reply.

We'll use just the small sample of data you posted.

>        A         B           C          D
> 1   Store #   Account #   Date       Amount
> 2   2885      120000      05/17/06   100.00
> 3   2950      130405      06/17/06    50.00
> 4   2885      130402      04/20/04   200.00
> 5   2950      126210      08/17/05    50.00


Assume that data is in a sheet named Dump in the range A1:D5, A1:D1 being
the column headers.

Create a sheet and name it 2885 for store # 2885.

In A1 enter this formula:

=MID(CELL("filename",A1),FIND("]",CELL("filename",A1))+1,255)

This will return the sheet name 2885. This will also come into play later on
when creating sheets for the other stores.
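If it helps to see what that MID/FIND pair is doing: CELL("filename",A1)
returns the workbook's full path with the sheet name appended after the
workbook name in square brackets, and the formula just takes everything
after the "]". Here's a minimal Python sketch of the same string logic
(the path is hypothetical, just for illustration):

```python
# What CELL("filename",A1) returns: full path, workbook name in
# brackets, then the sheet name (hypothetical path for illustration).
cell_filename = r"C:\Reports\[Stores.xls]2885"

# FIND("]",...) locates the closing bracket; MID(...,pos+1,255) takes
# everything after it -- the sheet name.
sheet_name = cell_filename[cell_filename.find("]") + 1:]
print(sheet_name)  # 2885
```

One caveat: CELL("filename",A1) returns an empty string until the
workbook has been saved at least once, so save the file before checking
the result.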

In B1 enter this formula:

=COUNTIF(Dump!A:A,A1)

This will return the number of rows of data that there are in sheet Dump for
store # 2885.

Enter these column headers in A3:C3: Account #, Date, Amount

Enter this formula in A4. This is an array formula and MUST be entered using
the key combination CTRL+SHIFT+ENTER:

=IF(ROWS($1:1)<=$B$1,INDEX(Dump!B$2:B$5,SMALL(IF(Dump!$A$2:$A$5=$A$1+0,ROW(Dump!B$2:B$5)-ROW(Dump!B$2)+1),ROWS($1:1))),"")

Copy across to C4 then down. You'll need to copy the formula to enough rows
that all the data for store # 2885 is returned. In your reply you say this is
typically about 30 to 40 rows, so copy down maybe 50 rows.
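In case the array formula looks like magic: the IF/SMALL part builds a
list of the relative row positions in the dump that match the store #,
ROWS($1:1) picks the 1st, 2nd, 3rd... of those as the formula is copied
down, and INDEX pulls the value from that row. Here's a rough Python
sketch of the same computation, using the sample data from your post:

```python
# Sketch of what the array formula computes, with the posted sample
# rows standing in for Dump!A2:D5.
dump = [
    # Store #, Account #, Date,       Amount
    (2885,     120000,    "05/17/06", 100.00),
    (2950,     130405,    "06/17/06",  50.00),
    (2885,     130402,    "04/20/04", 200.00),
    (2950,     126210,    "08/17/05",  50.00),
]

store = 2885

# COUNTIF(Dump!A:A,A1): how many dump rows belong to this store.
count = sum(1 for row in dump if row[0] == store)

# IF/SMALL/INDEX, taken together: the k-th matching row's values.
# Copying the formula across B4:C4... is the same as taking columns
# 1..3 (Account #, Date, Amount) of each matching row.
matches = [row[1:] for row in dump if row[0] == store]

print(count)    # 2
print(matches)  # [(120000, '05/17/06', 100.0), (130402, '04/20/04', 200.0)]
```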

That's it for store # 2885.

Now, let's create sheets for the other stores. This is pretty cool.........

Select sheet 2885.

Right click on the sheet tab.

Select Move or Copy....

Select Create a copy

OK

Select the copied sheet that has the name 2885 (2)

Right click the sheet tab and select Rename.

Rename the sheet 2950 (based on the store numbers of your posted sample
data)

That's it for store # 2950

Now, just repeat this process for the other store #'s.
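The reason the copy-and-rename trick works is that each copied sheet's A1
formula re-reads its own tab name, so every copy automatically pulls its
own store's rows. The net effect across all sheets is a grouping of the
dump by store #, which you could sketch in Python like this:

```python
# Sketch of the end result of repeating the copy-and-rename step:
# one table per store #, built here by grouping the dump rows.
dump = [
    (2885, 120000, "05/17/06", 100.00),
    (2950, 130405, "06/17/06",  50.00),
    (2885, 130402, "04/20/04", 200.00),
    (2950, 126210, "08/17/05",  50.00),
]

sheets = {}  # store # -> its rows, like one worksheet per store
for store, account, date, amount in dump:
    sheets.setdefault(store, []).append((account, date, amount))

for store, rows in sorted(sheets.items()):
    print(store, rows)
```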

Biff

"Rob" <[email protected]> wrote in message
news:[email protected]...
> Hi Biff,
>
> Here are the answers to your questions:
>
> *Typically, there will probably be about 30-40 lines of data for a
> specific
> store.
>
> *Over the course of a year, there will be 200 different stores that need
> to
> be analyzed.
>
> *The "Dump" of data can be sorted anyway in which it needs to be (Store #)
>
> *The "Dump" of data will be updated every month.
>
> -Basically, as we open new stores we need to analyze all data for that new
> store. We update this particular file every month - and add new stores
> (tabs)
> as required.
>
> Hope this helps.
> Thanks!
> Rob
>
>
> "Biff" wrote:
>
>> Ok, one more question, well, maybe two or three.....
>>
>> Out of those 4000 rows approximately how many will typically be
>> associated
>> with store # 2885?
>>
>> How many different store #'s are there? I assume you want to extract the
>> data for each different store to its own sheet?
>>
>> Can you sort the "dump" sheet by store # ?
>>
>> Is the "dump" static or does it change on a regular basis (a new "dump"
>> everyday?) Is the size of the "dump" always the same?
>>
>> A lot of questions but this is the type of info that determines your best
>> approach to this. This *could* be done with formulas but it probably
>> isn't
>> the best approach. If you do this regularly I'd turn on the macro
>> recorder
>> then apply a filter and start filtering on the store #'s, copy/paste to
>> the
>> appropriate sheets. Do it once manually and thereafter you have the macro
>> to
>> do it for you automatically.
>>
>> Biff
>>
>> "Rob" <[email protected]> wrote in message
>> news:[email protected]...
>> > The dump of data starts in A1 and goes to E4000
>> >
>> > "Biff" wrote:
>> >
>> >> How big is "the dump" ?
>> >>
>> >> x columns by x rows
>> >>
>> >> Biff
>> >>
>> >> "Rob" <[email protected]> wrote in message
>> >> news:[email protected]...
>> >> > The only issue is this - the "dump" of data is located on a
>> >> > different
>> >> > tab.
>> >> >
>> >> > Example - tab #1 is the dump of data. Tab #2 is a specific store
>> >> > (2885),
>> >> > and
>> >> > tab #3 is another store - and so on.
>> >> >
>> >> > So I basically need to get all the individual store data on a
>> >> > specific
>> >> > tab
>> >> > (based on the criteria).
>> >> >
>> >> > Thanks!
>> >> > Rob
>> >> >
>> >> > "Nick Hodge" wrote:
>> >> >
>> >> >> You should look at setting up an autofilter.
>> >> >>
>> >> >> Data>Filter>Autofilter and filter on the store#
>> >> >>
>> >> >> --
>> >> >> HTH
>> >> >> Nick Hodge
>> >> >> Microsoft MVP - Excel
>> >> >> Southampton, England
>> >> >> www.nickhodge.co.uk
>> >> >> [email protected]HIS
>> >> >>
>> >> >>
>> >> >> "Rob" <[email protected]> wrote in message
>> >> >> news:[email protected]...
>> >> >> > Hello,
>> >> >> >
>> >> >> > Heres my issue - I have a single piece of criteria (ie - Store
>> >> >> > #),
>> >> >> > and
>> >> >> > given
>> >> >> > this, I need to look at a large dump of data, and return all the
>> >> >> > values
>> >> >> > for
>> >> >> > this particular store number.
>> >> >> >
>> >> >> > Example :
>> >> >> > Here is the data dump :
>> >> >> >        A         B           C          D
>> >> >> >     Store #   Account #   Date       Amount
>> >> >> > 1   2885      120000      05/17/06   100.00
>> >> >> > 2   2950      130405      06/17/06    50.00
>> >> >> > 3   2885      130402      04/20/04   200.00
>> >> >> > 4   2950      126210      08/17/05    50.00
>> >> >> >
>> >> >> > Now, If I am running a report on Store #2885 - what I need the
>> >> >> > formula
>> >> >> > to
>> >> >> > do
>> >> >> > is give me all pieces of information that relate to that store.
>> >> >> > Here
>> >> >> > is
>> >> >> > what
>> >> >> > the end result of the formula should be :
>> >> >> >
>> >> >> > Store # Account # Date Amount
>> >> >> > 2885 120000 05/17/06 100.00
>> >> >> > 2885 130402 04/20/04 200.00
>> >> >> >
>> >> >> > Can anybody help me ?
>> >> >> >
>> >> >> > Thanks!
>> >> >> > Rob
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >>
>> >> >>
>> >> >>
>> >>
>> >>
>> >>
